Registration is now open for a workshop on “Improving Data Mobility and Management for International Cosmology” to be held February 10-11, 2015, at Lawrence Berkeley National Laboratory in California. The workshop, one in a series of Cross-Connects workshops, is sponsored by the Department of Energy’s ESnet and Internet2. Early registration is encouraged, as attendance is limited.
Scientists using supercomputers found that genes sensitive to cold and drought in a plant help it...
Just because concrete is the most widely used building material in human history doesn’t mean it...
The HPC Advisory Council and the Swiss National Supercomputing Centre will host the HPC Advisory Council Switzerland Conference 2015 at the Lugano Convention Centre, Lugano, Switzerland, from March 23 to 25, 2015. The conference will focus on high-performance computing essentials, new developments and emerging technologies, best practices, and hands-on training.
The aim of this conference is to bring together all of the stakeholders involved in solving the software challenges of exascale computing — from application developers, through numerical library experts, programming model developers and integrators, to tool designers. EASC2015 is organised by EPCC at the University of Edinburgh.
It’s no secret that finding good talent is hard. It’s even harder in the HPC and scientific community. To help bridge the gap between the next wave of HPC professionals and the commercial vendors that require their talent, the HPC Advisory Council has joined forces with ISC Events to host the fourth HPCAC-ISC Student Cluster Competition 2015.
NCSA is enabling software heavily used in industry to run faster, creating competitive advantages for some of the nation’s largest companies. Industry is a heavy user of supercomputing: it is central to the business of companies in sectors as diverse as oil and gas, pharmaceuticals, aerospace, and automotive.
In 2005, a semi-truck hauling 35,000 pounds of explosives through Spanish Fork Canyon in Utah crashed and caught fire, causing a dramatic explosion that left a 30-by-70-foot crater in the highway. Fortunately, there were no fatalities. Such accidents are extremely rare but can have devastating results, so a better understanding of exactly how such explosions occur is an important step toward learning how to prevent them.
The 56th HPC User Forum will take place from April 13-15, 2015, at the Marriott Norfolk Waterside in Norfolk, Virginia.
NOAA has announced the next phase in the agency’s efforts to increase supercomputing capacity to provide more timely, accurate, reliable and detailed forecasts. By October 2015, the capacity of each of NOAA’s two operational supercomputers will jump to 2.5 petaflops, for a total of 5 petaflops — a nearly tenfold increase from the current capacity.
Turbulent combustion simulations, which provide input to the design of more fuel-efficient combustion systems, have gotten their own efficiency boost. Researchers developed new algorithmic features that streamline turbulent flame simulations, which play an important role in designing more efficient combustion systems. They tested the enhanced code on the Hopper supercomputer and achieved a dramatic decrease in simulation times.
Not long ago, it would have taken several years to run a high-resolution simulation on a global climate model. But using supercomputing resources at NERSC, climate scientist Michael Wehner was able to complete a run in just three months. He found that the high-resolution simulations were not only much closer to actual observations, but also far better at reproducing intense storms, such as hurricanes and cyclones.
Researchers have detected at least three instances of cross-species mating that likely influenced the evolutionary paths of “old world” mice, two in recent times and one in the distant past. They think these instances of introgressive hybridization are only the first of many needles waiting to be found in a very large genetic haystack. The finding suggests that hybridization in mammals may not be an evolutionary dead end.
ISC has announced ISC Cloud & Big Data, a conference that merges its formerly separate cloud and big data events into a single three-day gathering, to take place in Frankfurt, Germany, September 28 to 30, 2015. The new format offers attendees two full days of multi-track sessions highlighting the current and future technologies and applications most relevant to the cloud and big data fields. In addition, there will be one full day of workshops.
About 95 percent of the more than 10,000 known bird species evolved only after the extinction of the dinosaurs about 66 million years ago. According to computer analyses of the genetic data, today’s diversity developed from a few species at a virtually explosive rate within 15 million years. Scientists designed the algorithms for this comprehensive analysis of bird evolution; the analysis required a computing capacity of 300 processor-years.
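To put that unit in perspective, here is a minimal Python sketch converting processor-years into wall-clock time; the cluster sizes below are hypothetical examples, not figures from the study:

```python
# Back-of-the-envelope: how long 300 processor-years takes in wall-clock
# time on clusters of various sizes. The core counts are hypothetical
# examples, not figures from the avian genomics project.
PROCESSOR_YEARS = 300

for cores in (1_000, 10_000, 50_000):
    wall_clock_days = PROCESSOR_YEARS / cores * 365
    print(f"{cores:>6} cores: about {wall_clock_days:.1f} days of continuous computing")
```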
The genomes of modern birds tell a story of how they emerged and evolved after the mass extinction that wiped out dinosaurs and almost everything else 66 million years ago. That story is now coming to light, thanks to an ambitious international collaboration that has been underway for four years. The first findings of the Avian Phylogenomics Consortium are being reported nearly simultaneously in 28 papers.
Celebrating its 30th anniversary, ISC High Performance has announced that the 2015 program’s technical content “will be strikingly broad in subject matter, differentiated and timely.” Over 2,600 attendees will gather in Frankfurt, from July 12 to 16, to discuss their organizational needs and the industry’s challenges, as well as learn about the latest research, products and solutions.
The US Department of Energy is mining for solutions to the rare earth problem — but with high-performance computing instead of bulldozers. Researchers are using the hybrid CPU-GPU, 27-petaflop Titan supercomputer managed by the Oak Ridge Leadership Computing Facility at Oak Ridge National Laboratory to discover alternative materials that can substitute for rare earths.
What does a black hole look like up close? As the sci-fi movie Interstellar wows audiences with its computer-generated views of one of the most enigmatic and fascinating phenomena in the universe, University of Arizona (UA) astrophysicists Chi-kwan Chan, Dimitrios Psaltis and Feryal Ozel are likely nodding appreciatively and saying something like, "Meh, that looks nice, but check out what we've got."
An interstellar mystery of why stars form has been solved thanks to the most realistic supercomputer simulations of galaxies yet made. Theoretical astrophysicist Philip Hopkins led research that found that stellar activity — like supernova explosions or even just starlight — plays a big part in the formation of other stars and the growth of galaxies.
In a real-time challenge, the 11 teams of undergraduate students will build a small cluster of their own design on the ISC 2015 exhibit floor and race to demonstrate the greatest performance across a series of benchmarks and applications. It all concludes with a ceremony on the main conference keynote stage to award and recognize all student participants in front of thousands of HPC luminaries.
At the SC14 conference, which took place recently in New Orleans, IDC’s HPC Innovation Excellence Award Program continued to showcase the benefits of investment in high performance computing (HPC). Initiated in 2011 to recognize innovative achievements using HPC, the program is designed to provide a means to evaluate the economic and scientific value HPC systems contribute.
The Intelligence Advanced Research Projects Activity (IARPA), within the Office of the Director of National Intelligence (ODNI), has embarked on a multi-year research effort to develop a superconducting computer. If successful, technology developed under the Cryogenic Computer Complexity (C3) program will pave the way to a new generation of superconducting supercomputers that are far more energy efficient.
Leaders in science, engineering, government, and industry will address fast-moving opportunities and challenges in the field of “big data” at the Virginia Summit on Science, Engineering, and Medicine.
A team of researchers from Argonne National Laboratory and DataDirect Networks (DDN) moved 65 terabytes of data in under 100 minutes at a recent supercomputing conference.
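For a sense of the sustained rate that implies, here is a quick calculation, assuming decimal terabytes and the full 100-minute window:

```python
# Effective throughput of moving 65 TB in 100 minutes,
# assuming decimal terabytes (1 TB = 10**12 bytes).
data_bytes = 65 * 10**12
seconds = 100 * 60

bits_per_second = data_bytes * 8 / seconds
print(f"Sustained rate: {bits_per_second / 1e9:.1f} Gbps")  # ~86.7 Gbps
```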
Twice each year, a ranking of general-purpose systems that are in common use for high-end applications is compiled and published by the TOP500 Project. Most recently released at the SC14 conference, this “much anticipated, much watched and much debated twice-yearly event” reveals the 500 most powerful commercially available computer systems, ranked by their performance on the LINPACK Benchmark.
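As a toy illustration of the benchmark’s principle (not the official HPL code, which is a distributed-memory implementation): time a dense solve of Ax = b and convert the standard HPL operation count, (2/3)n³ + 2n², into a FLOPS figure.

```python
# Toy LINPACK-style measurement on a single node: time a dense solve
# and convert the HPL operation count, (2/3)n^3 + 2n^2, into GFLOPS.
import time
import numpy as np

n = 4000
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))
b = rng.standard_normal(n)

start = time.perf_counter()
x = np.linalg.solve(A, b)          # LU factorization plus triangular solves
elapsed = time.perf_counter() - start

flops = (2 / 3) * n**3 + 2 * n**2  # HPL convention for the operation count
print(f"n={n}: {elapsed:.2f} s, {flops / elapsed / 1e9:.1f} GFLOPS")
```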
For their calculations, researchers at the Helmholtz-Zentrum Dresden-Rossendorf (HZDR) will, starting in 2015, have access to the world’s second-fastest computer. The Dresden scientists are hoping that the computations will yield new insights that may prove useful in proton-based cancer therapy.
In 1997, IBM’s Deep Blue computer beat chess wizard Garry Kasparov. This year, a computer system developed at the University of Wisconsin-Madison equaled or bested scientists at the complex task of extracting data from scientific publications and placing it in a database that catalogs the results of tens of thousands of individual studies.