Scalable Productivity and the Ever-Increasing Tie between Data Analytics, Data Management and Computation | March 7, 2014 3:52 pm | by Barry Bolding | Blogs
Cray continues to see an increasing trend in the HPC marketplace that we are calling “data-intensive” supercomputing. The dramatic growth in scientific, commercial and social data is resulting in an expanded customer base that is asking for much more complex analysis and simulation.
Steve Conway, IDC Vice President for HPC, explains that, to date, most data-intensive HPC jobs in the government...
Encryption and nuclear weapons are two easily recognized examples where a combinatorial...
The 10-day tour of Europe was not your typical itinerary — Garching, Karlsruhe, Villigen,...
The Department of Energy’s National Energy Research Scientific Computing Center (NERSC) announced the winners of its second annual High Performance Computing (HPC) Achievement Awards on February 4, 2014, during the annual NERSC User Group meeting at Lawrence Berkeley National Laboratory (Berkeley Lab).
Lawrence Livermore has joined forces with two other national labs to deliver next generation supercomputers able to perform up to 200 peak petaflops (quadrillions of floating point operations per second), about 10 times faster than today's most powerful high performance computing (HPC) systems.
The Russian Ministry of Education and Science has awarded a $3.4 million “mega-grant” to Alexei Klimentov, Physics Applications Software Group Leader at the U.S. Department of Energy’s Brookhaven National Laboratory, to develop new “big data” computing tools for the advancement of science.
Although the time and cost of sequencing an entire human genome has plummeted, analyzing the resulting three billion base pairs of genetic information from a single genome can take many months. However, a team working with Beagle, one of the world's fastest supercomputers devoted to life sciences, reports that genome analysis can be radically accelerated. This computer is able to analyze 240 full genomes in about two days.
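A quick back-of-the-envelope check of that throughput, as a minimal Python sketch (the 240-genomes-in-two-days figure comes from the article; the per-genome rate is an inference and assumes the genomes are processed in parallel on the machine, so it is an amortized rate rather than a per-sample latency):

    # Effective throughput implied by the article's figures.
    genomes = 240
    minutes = 2 * 24 * 60  # "about two days"
    print(f"~{minutes / genomes:.0f} minutes of machine time per genome, amortized")  # ~12

At roughly 12 amortized minutes per genome, the contrast with an analysis that otherwise takes many months is stark.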
Multi-scale Simulation Software for Chemistry Research Developed Using Trestles and Gordon Supercomputers | February 19, 2014 6:48 pm | by San Diego Supercomputer Center | News
Researchers at the San Diego Supercomputer Center at the University of California, San Diego, have developed software that greatly expands the types of multi-scale QM/MM (mixed quantum and molecular mechanical) simulations of complex chemical systems that scientists can use to design new drugs, better chemicals, or improved enzymes for biofuels production.
Black holes may be dark, but the areas around them definitely are not. These dense, spinning behemoths twist up gas and matter just outside their event horizon, and generate heat and energy that gets radiated, in part, as light. And when black holes merge, they produce a bright intergalactic burst that may act as a beacon for their collision.
HPC matters, now more than ever. What better way to show how it matters than through your submission to the SC14 Technical Program? Technical Program submissions opened February 14 for Research Papers, Posters (Regular, Education, and ACM Student Research Competition), Panels, Tutorials, BOF Sessions, the Scientific Visualization and Data Analytics Showcase, Emerging Technologies, and the Doctoral Showcase.
Researchers have found that the melanopsin pigment in the eye is potentially more sensitive to light than its more famous counterpart, rhodopsin, the pigment that allows for night vision. For more than two years, they have been investigating melanopsin, a retina pigment capable of sensing light changes in the environment, informing the nervous system and synchronizing it with the day/night rhythm.
Seeking a solution to decoherence — the “noise” that prevents quantum processors from functioning properly — scientists have developed a strategy of linking quantum bits together into voting blocks, a strategy that significantly boosts their accuracy. The team found that their method results in at least a five-fold increase in the probability of reaching the correct answer when the processor solves the largest problems.
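The voting-block idea has a simple classical analogue: replicate a noisy bit and let the copies vote. Below is a minimal Monte Carlo sketch of that generic principle (not the researchers' actual quantum scheme; the 10 percent error rate and five-copy block are illustrative assumptions):

    import random

    def majority_error_rate(p_flip, block_size, trials=100_000):
        """Estimate how often a majority vote over independent noisy
        copies of a bit reports the wrong value."""
        wrong = 0
        for _ in range(trials):
            flips = sum(random.random() < p_flip for _ in range(block_size))
            if flips > block_size // 2:  # majority of copies corrupted
                wrong += 1
        return wrong / trials

    print(majority_error_rate(0.10, 1))  # ~0.10: a lone noisy bit errs 10% of the time
    print(majority_error_rate(0.10, 5))  # ~0.009: a five-way voting block errs far less

Even this crude model shows the trade at work: spending extra (qu)bits on redundancy buys a sharp drop in the error rate.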
IBM has launched a 10-year initiative to bring Watson and other cognitive systems to Africa in a bid to fuel development and spur business opportunities across the world's fastest growing continent. Dubbed "Project Lucy" after the earliest known human ancestor, the initiative will see IBM invest US$100 million.
Computational scientists now have the opportunity to apply for the upcoming Argonne Training Program on Extreme-Scale Computing (ATPESC), to take place August 3-15, 2014. The program provides intensive hands-on training on the key skills, approaches and tools to design, implement, and execute computational science and engineering applications on current supercomputers and the HPC systems of the future.
The same physics that gives tornadoes their ferocious stability lies at the heart of new University of Washington research, and could lead to a better understanding of nuclear dynamics in studying fission, superconductors and the workings of neutron stars.
The HPC Advisory Council and the Swiss Supercomputing Centre will host the HPC Advisory Council Switzerland Conference 2014. The conference will focus on High-Performance Computing essentials, new developments and emerging technologies, best practices and hands-on training.
In 2014, PRACE will organise its first Scientific and Industrial Conference – the first edition of the PRACE Days – under the motto "HPC for Innovation – When Science Meets Industry." The conference combines the previously separate PRACE Scientific Conferences and PRACE Industrial Seminars, and will bring together experts from academia and industry who will present their advancements in HPC-supported science and engineering.
The 21st annual IEEE International Conference on High Performance Computing (HiPC 2014) will be held at the Hotel Cidade De Goa in Goa, India, December 17-20, 2014. It will serve as a forum for researchers from around the world to present their current research efforts and findings, and will act as a venue for stimulating discussions and highlighting high performance computing (HPC) related activities in Asia.
The National Energy Research Scientific Computing (NERSC) Center recently accepted “Edison,” a new flagship supercomputer designed for scientific productivity. Named in honor of American inventor Thomas Alva Edison, the Cray XC30 will be dedicated in a ceremony held at the Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab) on February 5, 2014, and scientists are already reporting results.
A research group of scientists and engineers led by the University of Bristol, UK, has made an important advance toward a quantum computer by shrinking down key components and integrating them onto a silicon microchip. Scientists have, for the first time, generated and manipulated single particles of light (photons) on a silicon chip — a major step forward in the race to build a quantum computer.
Texas A&M System Teams with IBM to Drive Computational Sciences Research through Big Data and Analytics | January 29, 2014 1:38 pm | by IBM | News
The Texas A&M University System and IBM have announced an agreement that marks the beginning of a broad research collaboration, supported by one of the largest computational sciences infrastructures dedicated to advances in agriculture, geosciences and engineering.
On Monday, January 27, over 250 people gathered at the small evangelical church in Daisbach – a very small, quiet town in Southern Germany – to bid farewell to Hans Meuer, the founder of TOP500 and the ISC General Chair. Hans passed away on January 20 at the age of 77.
Inspired by nature, scientists from Berlin and Heidelberg are using artificial nerve cells to classify different types of data. A bakery assistant who takes bread from the shelf only to hand it to his boss, who then passes it to the customer? Rather unlikely. Instead, both work at the same time to sell the baked goods.
The goal of this conference is to bring together all the developers and researchers involved in solving the software challenges of the exascale era. The conference focuses on issues of applications for exascale and the associated tools, software programming models and libraries.
Ab initio: "From the beginning." It is a term that's used in science to describe calculations that rely on established mathematical laws of nature, or "first principles," without additional assumptions or special models. But when it comes to the phenomena that Milos Milosavljevic is interested in calculating, we're talking really ab initio, as in: from the beginning of time onward.
ISC General Chair Prof. Dr. Hans Werner Meuer passed away at the age of 77 at his home in Daisbach, Southern Germany, on January 20, 2014, after a brief battle with cancer. Meuer had been involved in data processing since 1960. He served as a specialist, project leader, and group and department head during his 11 years, from 1962 to 1973, at the Research Center in Jülich, Germany.
IBM has announced plans to commit over $1.2 billion to significantly expand its global cloud footprint. This investment includes a network of data centers designed to bring clients greater flexibility, transparency and control over how they manage their data, run their businesses and deploy their IT operations in the cloud.
Outshining the black holes they surround, the bright, hot centers of galaxies known as active galactic nuclei can spew jets of plasma thousands of light-years long. These streams of plasma create an effect often seen in popular images — galaxies speared through the heart by intense light. Such jets are also associated with stars and other astronomical phenomena.