In the middle of the 19th century, the massive binary system Eta Carinae underwent an eruption that ejected at least 10 times the sun's mass and made it the second-brightest star in the sky. Now, a team of astronomers has used extensive new observations to create the first high-resolution 3-D model of the expanding cloud produced by this outburst.
The National Nuclear Security Administration (NNSA) and Cray have entered into a contract...
With over 700 new functions — the single biggest jump in new functionality in the software's...
In nearly every field of science, experiments, instruments, observations, sensors, simulations,...
A stunning video based on fossils of a 410-million-year-old arachnid — one of the first predators on land — recreates the animal walking. Researchers used exceptionally preserved fossils from the Natural History Museum in London to create the video showing the most likely walking gait of the animal; the study is published in a special issue of the Journal of Paleontology.
At a busy shopping mall, shoppers walk by store windows to find attractive items to purchase. Through the windows, shoppers can see the products displayed, but may have a hard time imagining doing something beyond just looking, such as touching the displayed items or communicating with sales assistants inside the store. With TransWall, however, window shopping could become more fun and real than ever before.
UCL researchers have developed a powerful new model to detect life on planets outside of our solar system more accurately than ever before. The new model focuses on methane, the simplest organic molecule, widely acknowledged to be a sign of potential life.
Discovering a brain tumor is a very serious issue, but it is not the end of the story. There are many different types of brain tumor, each with different survival rates and different methods of treatment. Today, however, many brain tumors remain difficult to diagnose clearly, leading to poor prognoses for patients.
Moab HPC Suite-Enterprise Edition 8.0 (Moab 8.0) is designed to enhance Big Workflow by processing intensive simulations and big data analysis to accelerate insights. It delivers dynamic scheduling, provisioning and management of multi-step/multi-application services across HPC, cloud and big data environments. The software suite bolsters Big Workflow’s core services: unifying data center resources, optimizing the analysis process and guaranteeing services to the business.
Fully automated “deep learning” by computers greatly improves the odds of discovering particles such as the Higgs boson, beating even veteran physicists’ abilities.
A new type of steel-reinforced concrete protects buildings better from bomb attacks. Researchers have developed a formula to quickly calculate the concrete’s required thickness. The material will be used in the One World Trade Center at Ground Zero.
A relic from long before the age of supercomputers, the 169-year-old math strategy called the Jacobi iterative method is widely dismissed today as too slow to be useful. But thanks to a curious, numbers-savvy engineering student and his professor, it may soon get a new lease on life.
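The Jacobi method itself is compact enough to sketch: each unknown is repeatedly re-estimated from the current values of the others. The sketch below is illustrative only; the matrix, tolerance, and iteration cap are assumed, not taken from the article.

```python
# Minimal sketch of the Jacobi iterative method for solving A x = b.
# Each sweep computes x_i <- (b_i - sum_{j != i} A_ij * x_j) / A_ii.
import numpy as np

def jacobi(A, b, tol=1e-10, max_iter=500):
    D = np.diag(A)             # diagonal entries A_ii
    R = A - np.diagflat(D)     # off-diagonal remainder of A
    x = np.zeros_like(b, dtype=float)
    for _ in range(max_iter):
        x_new = (b - R @ x) / D
        if np.linalg.norm(x_new - x, ord=np.inf) < tol:
            return x_new
        x = x_new
    return x

# A diagonally dominant system, for which Jacobi is guaranteed to converge.
A = np.array([[4.0, 1.0], [2.0, 5.0]])
b = np.array([6.0, 12.0])
x = jacobi(A, b)
```

Because each sweep updates every unknown independently of the others, the method parallelizes naturally, which is exactly why it is attractive again on modern hardware despite its slow per-iteration convergence.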
"Big data" is playing an increasingly big role in the renewable energy industry and the transformation of the nation's electrical grid, and no single entity provides a better tool for such data than the Energy Department's Energy Systems Integration Facility (ESIF) located on the campus of the National Renewable Energy Laboratory (NREL).
By combining advanced mathematics with high-performance computing, scientists have developed a tool that allowed them to calculate a fundamental property of most atoms on the periodic table to historic accuracy — reducing error by a factor of a thousand in many cases.
Computer analysis of photographs could help doctors diagnose which condition a child with a rare genetic disorder has, say Oxford University researchers.
To make use of huge amounts of data, we have to understand them, and before that we need to categorize them in an effective, fast and automatic manner. Two researchers have devised a method of cluster analysis, which groups data sets according to their "similarity." Based on simple and powerful principles, the method has proved very efficient and capable of solving some of the most typical problems encountered in this type of analysis.
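The blurb does not spell out the algorithm, but one simple and powerful density-based take on grouping by similarity works like this: points that have many close neighbors, and that are also far from any denser point, are treated as cluster centers; every other point joins the cluster of its nearest denser neighbor. The sketch below illustrates that idea only; the function name, the cutoff `d_c`, and the toy data are all assumptions, not the researchers' exact method.

```python
# Hedged sketch of density-based clustering by "similarity" (distance).
import numpy as np

def density_peak_clusters(X, d_c, n_centers):
    n = len(X)
    dist = np.linalg.norm(X[:, None] - X[None, :], axis=2)
    rho = (dist < d_c).sum(axis=1) - 1        # local density: neighbors within d_c
    delta = np.zeros(n)                       # distance to nearest denser point
    nearest_denser = np.full(n, -1)
    order = np.argsort(-rho)                  # visit points densest-first
    delta[order[0]] = dist[order[0]].max()
    for k in range(1, n):
        i = order[k]
        denser = order[:k]
        j = denser[np.argmin(dist[i, denser])]
        delta[i], nearest_denser[i] = dist[i, j], j
    # Centers: simultaneously dense and far from anything denser.
    centers = np.argsort(-(rho * delta))[:n_centers]
    labels = np.full(n, -1)
    labels[centers] = range(n_centers)
    for i in order:                           # densest-first, so the denser
        if labels[i] == -1:                   # neighbor is already labeled
            labels[i] = labels[nearest_denser[i]]
    return labels

# Two well-separated blobs of four points each (illustrative data).
X = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1], [0.1, 0.1],
              [5.0, 5.0], [5.1, 5.0], [5.0, 5.1], [5.1, 5.1]])
labels = density_peak_clusters(X, d_c=1.0, n_centers=2)
```

Assigning points in decreasing-density order means each point's denser neighbor already has a label, so a single pass suffices and no iterative refinement is needed.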
A new mathematical model could help engineers control the formation of wrinkle, crease, and fold structures in a wide variety of materials. It may also help scientists understand how these structures form in nature.
In the coming decades, we will likely commute to work and explore the countryside in autonomous, or driverless, cars capable of communicating with the roads they are traveling on. A convergence of technological innovations in embedded sensors, computer vision, artificial intelligence, control and automation, and computer processing power is making this feat a reality.
Machine learning, in which computers learn new skills by looking for patterns in training data, is the basis of most recent advances in artificial intelligence, from voice-recognition systems to self-parking cars. It’s also the technique that autonomous robots typically use to build models of their environments. That type of model-building gets complicated, however, in cases in which clusters of robots work as teams.
Mechanical engineers at the Babol University of Technology in Mazandaran, Iran, have turned to nature, devising an algorithm based on the survival trials salmon face as they swim upstream to their spawning grounds. The approach helps them fish out the optimal solution to a given problem.
Supercomputer simulations have shown that clusters of a protein linked to cancer warp cell membranes. This research on these protein clusters, or aggregates as scientists call them, could help guide design of new anticancer drugs.
Theoretical physicists at Saarland University have developed a method that enables quantum computers to be powered up and running stably in just five minutes – something that took six hours to achieve previously. This huge time reduction has been achieved by making use of mathematical models.
HPC Innovation Excellence Award: Rolls-Royce, Procter and Gamble, National Center for Supercomputing Applications, Cray Inc., Livermore Software Technology Corporation | June 23, 2014 6:00 pm | Award Winners
Researchers from NCSA, Rolls-Royce, Procter and Gamble, Cray Inc. and Livermore Software Technology Corporation were able to scale the commercial explicit finite element code, LS-DYNA, to 15,000 cores on Blue Waters.
Researchers from Westinghouse Electric Company and the Consortium for Advanced Simulation of LWRs (CASL), a U.S. Department of Energy (DOE) Innovation Hub, performed core physics simulations of the AP1000® PWR startup core using CASL’s Virtual Environment for Reactor Application (VERA).
HPC Innovation Excellence Award: Culham Centre for Fusion Energy, EPCC at the University of Edinburgh, York Plasma Institute at the University of York, and Lund University | June 23, 2014 5:36 pm | Award Winners
Researchers from CCFE, EPCC and the Universities of York and Lund have made substantial recent optimizations to the well-known plasma turbulence code GS2. This included a total rewrite of the routines that calculate the response matrices required by the code's implicit algorithm, which has significantly accelerated GS2's initialization, typically by a factor of more than 10.
Engineers and scientists from Pipistrel utilized HPC and technical computing resources to design and develop the Taurus G4 aircraft. The aircraft was conceived, designed and built in a mere five months, relying heavily on CAD and rapid prototyping techniques, and especially on CFD and other computational aerodynamic tools to evaluate flight performance and handling before committing to building the prototype.
Engineers from THESAN srl, an Italian SME active in the renewable energy sector, teamed up with the Italian supercomputing center CINECA to develop simulation-driven engineering of hydroelectric turbines. The engineers and researchers built an HPC-based workflow to optimize the design of a new class of hydroelectric turbines.
Researchers from Argonne National Laboratory conducted one of the largest internal combustion engine simulations to date. Predictive internal combustion engine simulations require very fine spatial and temporal resolution, as well as high-fidelity, robust models of two-phase flow, spray, turbulence, combustion, and emissions.
University of Wisconsin researchers utilized HPC resources, in combination with multiple advanced protein structure prediction algorithms and deep sequence data mining, to construct a highly plausible capsid model for Rhinovirus-C (~600,000 atoms). The simulation model helps researchers explain why existing pharmaceuticals don't work on this virus.