Supercomputers

The Lead

Improving Data Mobility and Management for International Cosmology Workshop

January 28, 2015 3:06 pm | by Lawrence Berkeley National Laboratory | Events

Registration is now open for a workshop on “Improving Data Mobility and Management for International Cosmology” to be held February 10-11, 2015, at Lawrence Berkeley National Laboratory in California. The workshop, one in a series of Cross-Connects workshops, is sponsored by the Department of Energy’s ESnet and Internet2. Early registration is encouraged, as attendance is limited.

Needle in a Haystack: Finding the Right Genes in Tens of Thousands

January 28, 2015 2:45 pm | by TACC | News | Comments

Scientists using supercomputers found that genes sensitive to cold and drought in a plant help it...

Predicting Concrete Flow Properties from Simple Measurements

January 23, 2015 2:44 pm | by NIST | News | Comments

Just because concrete is the most widely used building material in human history doesn’t mean it...

Puzzle: Does Glass Ever Stop Flowing?

January 22, 2015 2:15 pm | by University of Bristol | News | Comments

Is glass a true solid? Researchers have combined computer simulation and information theory,...


HPC Advisory Council Switzerland Conference 2015

January 20, 2015 10:10 am | by HPC Advisory Council | Events

The HPC Advisory Council and the Swiss National Supercomputing Centre will host the HPC Advisory Council Switzerland Conference 2015 at the Lugano Convention Centre in Lugano, Switzerland, from March 23 to 25, 2015. The conference will focus on High-Performance Computing essentials, new developments and emerging technologies, best practices and hands-on training.

EASC2015: Solving Software Challenges for Exascale

January 20, 2015 10:01 am | by University of Edinburgh | Events

The aim of this conference is to bring together all of the stakeholders involved in solving the software challenges of the exascale — from application developers, through numerical library experts, programming model developers and integrators, to tools designers. EASC2015 is organised by EPCC at the University of Edinburgh.

Students who take part in these competitions jumpstart their careers in technology and research. They typically spend six months of their time preparing for this competition, all the while learning an incredible amount of skills in software, hardware and

Join us in Sponsoring the HPC Workforce of the Future

January 15, 2015 8:55 am | by Brian Sparks, HPC Advisory Council Media Relations and Events Director | Blogs | Comments

It’s no secret that finding good talent is hard. It’s even harder in the HPC and scientific community. To help bridge the gap between the next wave of HPC professionals and the commercial vendors that require their talent, the HPC Advisory Council has joined forces with ISC Events to host the fourth HPCAC-ISC Student Cluster Competition 2015.

Merle Giles, Director of NCSA Private Sector Programs and Economic Impact; Rob Rick, VP Sales Americas of Allinea Software

Supercomputing Creates Competitive Advantages in U.S. Industrial R&D

January 8, 2015 2:26 pm | by Allinea Software | News | Comments

The NCSA is enabling software heavily used in industry to run faster, and it’s creating competitive advantages for some of the nation’s largest companies. Industry is a heavy user of supercomputing: it is central to the business of companies within diverse sectors such as oil and gas, pharmaceutical, aerospace and automotive.

The top figure shows the progression of deflagration through the explosive cylinders (light blue) transitioning to detonation (0.710 msec). The dark blue region shows the position of the 2D pressure slice illustrated in the bottom figure. Courtesy of Jacq

Large-scale 3-D Simulations Aim at Safer Transport of Explosives

January 7, 2015 2:20 pm | by Jim Collins, Argonne Leadership Computing Facility | News | Comments

In 2005, a semi-truck hauling 35,000 pounds of explosives through the Spanish Fork Canyon in Utah crashed and caught fire, causing a dramatic explosion that left a 30-by-70-foot crater in the highway. Fortunately, there were no fatalities. Such accidents are extremely rare but can, obviously, have devastating results. So, better understanding exactly how such explosions occur is an important step toward learning how to prevent them.

IDC HPC User Forum, Norfolk, Virginia

January 6, 2015 4:28 pm | by IDC | Events

The 56th HPC User Forum will take place from April 13-15, 2015, at the Marriott Norfolk Waterside in Norfolk, Virginia.

NOAA's supercomputer upgrades will provide more timely, accurate weather forecasts.

Environmental Intelligence: Significant Investment in Next-Gen Supercomputers to Improve Weather Forecasts

January 6, 2015 12:26 pm | by NOAA | News | Comments

NOAA has announced the next phase in the agency’s efforts to increase supercomputing capacity to provide more timely, accurate, reliable and detailed forecasts. By October 2015, the capacity of each of NOAA’s two operational supercomputers will jump to 2.5 petaflops, for a total of 5 petaflops — a nearly tenfold increase from the current capacity.

This simulation illustrates the total mass density (left) and temperature (right) of a dimethyl ether jet fuel simulation. It is a snapshot of the solution that corresponds to a physical time of 0.00006 seconds. Courtesy of Matthew Emmett, Weiqun Zhang

Optimized Algorithms Give Combustion Simulations a Boost

December 18, 2014 4:32 pm | by Lawrence Berkeley National Laboratory | News | Comments

Turbulent combustion simulations, which provide input to the design of more fuel-efficient combustion systems, have gotten their own efficiency boost. Researchers developed new algorithmic features that streamline turbulent flame simulations, which play an important role in designing more efficient combustion systems. They tested the enhanced code on the Hopper supercomputer and achieved a dramatic decrease in simulation times.

Simulated and observed annual maximum five-day accumulated precipitation over land points, averaged. Observations are calculated from the period 1979 to 1999. Model results are calculated from the period 1979 to 2005.

Global High-resolution Models Fuel New Golden Age of Climate Science

December 18, 2014 4:14 pm | by Lawrence Berkeley National Laboratory | News | Comments

Not long ago, it would have taken several years to run a high-resolution simulation on a global climate model. But using supercomputing resources at NERSC, climate scientist Michael Wehner was able to complete a run in just three months. What he found was that not only were the simulations much closer to actual observations, but the high-resolution models were far better at reproducing intense storms, such as hurricanes and cyclones.

The species used in a Rice University genetic study of mice were collected from 15 locations in Europe and Africa. The green region indicates the range of Mus spretus, the Algerian mouse, while the blue region indicates the range of Mus musculus domesticus.

Big Data Analysis Reveals Shared Genetic Code between Species

December 18, 2014 11:32 am | by Mike Williams, Rice University | News | Comments

Researchers have detected at least three instances of cross-species mating that likely influenced the evolutionary paths of “old world” mice, two in recent times and one in the distant past. They think these instances of introgressive hybridization are only the first of many needles waiting to be found in a very large genetic haystack. The finding suggests that hybridization in mammals may not be an evolutionary dead end.

ISC Cloud & Big Data Conferences to Merge in 2015

December 16, 2014 12:11 pm | by Suzanne Tracy, Editor-in-Chief, Scientific Computing and HPC Source | Blogs | Comments

ISC has announced that its Cloud and Big Data conferences will merge into a single three-day event, to take place in Frankfurt, Germany, September 28 to 30, 2015. The new format offers attendees two full days of multi-track sessions, highlighting current and future technologies and the applications most relevant in the cloud and big data fields. In addition, there will be one full day of workshops.

For the family of bee-eaters (on the photo Merops bullocki), the study revealed a close relationship to oscine birds, parrots, and birds of prey. Courtesy of Peter Houde

Bird Tree of Life Reproduced using Gene Analysis, Supercomputing

December 15, 2014 1:57 pm | by Karlsruhe Institute of Technology | News | Comments

Of the more than 10,000 known bird species, about 95 percent evolved only after the extinction of the dinosaurs about 66 million years ago. According to computer analyses of the genetic data, today's diversity developed from just a few species at a virtually explosive rate within 15 million years. Scientists designed the algorithms for this comprehensive analysis of bird evolution; the computations required a capacity of 300 processor-years.

Duke researchers led by associate professor of neurobiology Erich Jarvis, left, did most of the DNA extraction from bird tissue samples used in the Avian Phylogenomics Consortium. (l-r: Carole Parent, Nisarg Dabhi, Jason Howard). Courtesy of Les Todd, Duke University

Mapping the "Big Bang" of Bird Evolution

December 12, 2014 6:04 pm | by Kelly Rae Chi, Duke University | News | Comments

The genomes of modern birds tell a story of how they emerged and evolved after the mass extinction that wiped out dinosaurs and almost everything else 66 million years ago. That story is now coming to light, thanks to an ambitious international collaboration that has been underway for four years. The first findings of the Avian Phylogenomics Consortium are being reported nearly simultaneously in 28 papers.

ISC High Performance is the only yearly international HPC forum that introduces over 300 hand-picked speakers to its attendees.

ISC High Performance Program to Offer Greater Diversity

December 12, 2014 4:32 pm | by ISC | News | Comments

Celebrating its 30th conference anniversary, ISC High Performance has announced that the 2015 program’s technical content “will be strikingly broad in subject matter, differentiated and timely.” Over 2,600 attendees will gather in Frankfurt, from July 12 to 16, to discuss their organizational needs and the industry’s challenges, as well as learn about the latest research, products and solutions.

Results of large-scale simulations showing the Alnico alloy separates into FeCo-rich and NiAl-rich phases at low temperatures and is a homogenized phase at high temperatures.

Solving the Shaky Future of Super-strong Rare Earth Magnets

December 11, 2014 4:15 pm | by Katie Elyce Jones, Oak Ridge National Laboratory | News | Comments

The US Department of Energy is mining for solutions to the rare earth problem — but with high-performance computing instead of bulldozers. Researchers are using the hybrid CPU-GPU, 27-petaflop Titan supercomputer managed by the Oak Ridge Leadership Computing Facility at Oak Ridge National Laboratory to discover alternative materials that can substitute for rare earths.

A black hole as depicted in the movie Interstellar -- Courtesy of Paramount Pictures

A Supermassive Black Hole Comes to the Big Screen

December 11, 2014 3:34 pm | by University of Arizona | News | Comments

What does a black hole look like up close? As the sci-fi movie Interstellar wows audiences with its computer-generated views of one of the most enigmatic and fascinating phenomena in the universe, University of Arizona (UA) astrophysicists Chi-kwan Chan, Dimitrios Psaltis and Feryal Ozel are likely nodding appreciatively and saying something like, "Meh, that looks nice, but check out what we've got."

Galactic gas from the Feedback in Realistic Environments (FIRE) simulation. Represented here is a Milky Way mass halo, with colors denoting different densities.

Interstellar Mystery Solved by Supercomputer Simulations

December 10, 2014 4:25 pm | by Jorge Salazar, Texas Advanced Computing Center | News | Comments

An interstellar mystery of why stars form has been solved thanks to the most realistic supercomputer simulations of galaxies yet made. Theoretical astrophysicist Philip Hopkins led research that found that stellar activity — like supernova explosions or even just starlight — plays a big part in the formation of other stars and the growth of galaxies.

The HPCAC-ISC Student Cluster Competition is an opportunity to showcase the world’s brightest computer science students’ expertise in a friendly, yet spirited competition.

University Teams for HPCAC-ISC 2015 Student Cluster Competition Announced

December 10, 2014 3:45 pm | by ISC | News | Comments

In a real-time challenge, the 11 teams of undergraduate students will build a small cluster of their own design on the ISC 2015 exhibit floor and race to demonstrate the greatest performance across a series of benchmarks and applications. It all concludes with a ceremony on the main conference keynote stage to award and recognize all student participants in front of thousands of HPC luminaries.

While most genome centers focus on research, the CPGM develops new clinical tests as a starting point for next-generation medical treatments to improve outcomes in patients.

Applying HPC to Improve Business ROI

December 9, 2014 3:31 pm | by Suzanne Tracy, Editor-in-Chief, Scientific Computing and HPC Source | Blogs | Comments

At the SC14 conference, which took place recently in New Orleans, IDC’s HPC Innovation Excellence Award Program continued to showcase benefits of investment in high performance computing (HPC). Initiated in 2011 to recognize innovative achievements using HPC, the program is designed to provide a means to evaluate the economic and scientific value HPC systems contribute.

IARPA to Develop a Superconducting Supercomputer

December 5, 2014 4:24 pm | by IARPA | News | Comments

The Intelligence Advanced Research Projects Activity (IARPA), within the Office of the Director of National Intelligence (ODNI), has embarked on a multi-year research effort to develop a superconducting computer. If successful, technology developed under the Cryogenic Computer Complexity (C3) program will pave the way to a new generation of superconducting supercomputers that are far more energy efficient.

Big Data Challenges at Virginia Academy Summit

December 4, 2014 5:25 pm | by Virginia Tech | News | Comments

Leaders in science, engineering, government, and industry will address fast-moving opportunities and challenges in the field of “big data” at the Virginia Summit on Science, Engineering, and Medicine.              

Argonne Researchers Demonstrate Extraordinary Throughput at SC14

December 3, 2014 3:34 pm | by Computation Institute | News | Comments

A team of researchers from Argonne National Laboratory and DataDirect Networks (DDN) moved 65 terabytes of data in just under 100 minutes at a recent supercomputing conference.

Ranking the World’s Top Supercomputers

December 3, 2014 12:42 pm | by Suzanne Tracy, Editor-in-Chief, Scientific Computing and HPC Source | Blogs | Comments

Twice each year, a ranking of general purpose systems that are in common use for high-end applications is compiled and published by the TOP500 Project. Most recently released at the SC14 conference, this “much anticipated, much watched and much debated twice-yearly event” reveals the 500 most powerful commercially available computer systems, ranked by their performance on the LINPACK Benchmark.

Titan Calculates for HZDR Cancer Research

December 2, 2014 3:21 pm | by HZDR | News | Comments

Starting in 2015, researchers at the Helmholtz-Zentrum Dresden-Rossendorf (HZDR) will have access to the world’s second-fastest computer for their calculations. The Dresden scientists are hoping that the computations will yield new insights that may prove useful in proton-based cancer therapy.

Computer Equal To or Better Than Humans at Cataloging Science

December 2, 2014 2:53 pm | by David Tenenbaum, University of Wisconsin-Madison | News | Comments

In 1997, IBM’s Deep Blue computer beat chess wizard Garry Kasparov. This year, a computer system developed at the University of Wisconsin-Madison equaled or bested scientists at the complex task of extracting data from scientific publications and placing it in a database that catalogs the results of tens of thousands of individual studies.
