SDSC Joins Intel Parallel Computing Centers Program with Focus on Molecular Dynamics, Neuroscience and Life Sciences
September 12, 2014 2:44 pm | by San Diego Supercomputer Center | News
The San Diego Supercomputer Center (SDSC) at the University of California, San Diego, is working with semiconductor chipmaker Intel to further optimize research software to improve the parallelism, efficiency, and scalability of widely used molecular and neurological simulation technologies.
It may look like fresh blood and flow like fresh blood, but the longer blood is stored, the less...
University of Hawaii at Manoa astronomer R. Brent Tully has led an international team of...
Hummingbirds can hover so well they seem to float in mid-air. With the help of a supercomputer, Vanderbilt University mechanical engineer Haoxiang Luo has fleshed out some of the secrets of how hummingbirds hover, a style of flight more like an insect's than a typical bird's.
NASA's Tropical Rainfall Measuring Mission (TRMM) satellite provided a look under the hood of Hurricane Cristobal as it moved north, paralleling the U.S. East Coast. NASA's HS3 hurricane mission also investigated the storm. Cristobal is close enough to the coast to trigger high surf advisories.
MIT spinout Akselos has developed novel software, based on years of research at the Institute, which uses precalculated supercomputer data for structural components — like simulated “Legos” — to solve FEA models in seconds. Hundreds of engineers in the mining, power-generation, and oil and gas industries are now using Akselos software.
Argonne National Laboratory was one of seven new winners of the HPC Innovation Excellence Award. Announced by International Data Corporation at the ISC '14 supercomputer industry conference in Leipzig, Germany, the award recognizes noteworthy achievements by users of high-performance computing (HPC) technologies.
New research by University of Montana doctoral student Jared Oyler provides improved computer models for estimating temperature across mountainous landscapes. Oyler provided a new climate dataset for ecological and hydrological research and natural resource management.
Recently, the Harvard-Smithsonian Center for Astrophysics unveiled an unprecedented simulation of the universe’s development. Called the Illustris project, the simulation depicts more than 13 billion years of cosmic evolution across a cube of the universe 350 million light-years on each side. But why was it important to conduct such a simulation?
Creating a realistic computer simulation of how light suffuses a room is crucial not just for animated movies like Toy Story or Cars, but also in industry. Specialized computing methods can achieve this, but they require great effort. Computer scientists from Saarbrücken have developed a novel approach that vastly simplifies and speeds up the whole calculation.
Physicists have created a tractor beam on water, providing a radical new technique that could confine oil spills, manipulate floating objects or explain rips at the beach. The group discovered they can control water flow patterns with simple wave generators, enabling them to move floating objects at will. The new technique gives scientists a way of controlling things adrift on water in a way that resembles sci-fi tractor beams.
A team representing Westinghouse Electric Company and the Consortium for Advanced Simulation of Light Water Reactors, a DOE Innovation Hub led by Oak Ridge National Laboratory, has received an HPC Innovation Excellence Award for applied simulation on Titan, the nation’s most powerful supercomputer. The award recognizes achievements made by industry users of high-performance computing technologies.
Using NASA's Hubble Space Telescope, a team of astronomers has spotted a star system that could have left behind a "zombie star" after an unusually weak supernova explosion. A supernova typically obliterates the exploding white dwarf, or dying star. On this occasion, scientists believe this faint...
When the space shuttle Columbia disintegrated on re-entry in 2003, sophisticated computer models were key to determining what happened. A piece of foam flew off at launch and hit a tile, damaging the leading edge of the shuttle wing and exposing the underlying structure. Temperatures soared to thousands of degrees as Columbia plunged toward Earth at 27 times the speed of sound, said Gallis, who used NASA codes and Icarus for simulations...
FLOW-3D 11 features FlowSight, an advanced visualization tool based on the EnSight post-processor, which offers powerful ways to analyze, visualize and communicate simulation data. Its capabilities include analyzing and comparing multiple simulation results simultaneously, volume rendering, a CFD calculator and flipbooks.
SystemModeler 4 is a physical modeling and simulation environment for cyber-physical systems. Using drag-and-drop from a large selection of built-in and expandable modeling libraries, users can build multi-domain models of their complete system.
Altair has announced its intent to acquire Visual Solutions, makers of VisSim, a visual language for mathematical modeling, simulation and model-based embedded system development used by scientists and engineers. The transaction is expected to close by the end of July 2014.
The Columbia River basin in the Pacific Northwest offers great potential for water power; hydroelectric power stations there already generate over 20,000 megawatts. Now a simulation model will help optimize the operation of the extensive dam system.
Every trillionth of a second, Panagiotis Grammatikopoulos calculates the location of each individual atom in a particle based on where it is and which forces apply. He uses a computer program to make the calculations, and then animates the motion of the atoms using visualization software. The resulting animation illuminates what happens, atom-by-atom, when two nanoparticles collide.
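The article does not publish Grammatikopoulos's code, but the step it describes is the core loop of molecular dynamics: compute the forces on every atom, then advance all positions by a tiny time step. The sketch below is a minimal, generic illustration in Python, assuming a Lennard-Jones interaction, reduced units and a velocity Verlet integrator, none of which are specified in the article.

```python
import numpy as np

def lj_forces(pos, eps=1.0, sigma=1.0):
    """Pairwise Lennard-Jones forces on every atom (reduced units, illustrative)."""
    n = len(pos)
    f = np.zeros_like(pos)
    for i in range(n):
        for j in range(i + 1, n):
            r = pos[i] - pos[j]
            d2 = r @ r
            # Force magnitude from U = 4*eps*((sigma/r)^12 - (sigma/r)^6)
            mag = 24 * eps * (2 * sigma**12 / d2**7 - sigma**6 / d2**4)
            f[i] += mag * r
            f[j] -= mag * r
    return f

def velocity_verlet(pos, vel, dt=1e-3, steps=1000, mass=1.0):
    """Advance every atom one small time step at a time (velocity Verlet)."""
    f = lj_forces(pos)
    for _ in range(steps):
        vel += 0.5 * dt * f / mass
        pos += dt * vel
        f = lj_forces(pos)
        vel += 0.5 * dt * f / mass
    return pos, vel

# Toy usage: two atoms drifting toward each other before a collision.
pos = np.array([[0.0, 0.0, 0.0], [2.0, 0.0, 0.0]])
vel = np.array([[0.5, 0.0, 0.0], [-0.5, 0.0, 0.0]])
pos, vel = velocity_verlet(pos, vel)
```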
Ensemble forecasting is a key part of weather forecasting. Computers typically run multiple simulations using slightly different initial conditions or assumptions, and then analyze them together to try to improve forecasts. Using Japan’s K computer, researchers have succeeded in running 10,240 parallel simulations of global weather, the largest number ever performed, using data assimilation to reduce the range of uncertainties.
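As a rough illustration of the ensemble idea, not of the K computer experiment itself, the sketch below perturbs the initial conditions of a toy chaotic "weather" model (Lorenz-63), runs many members forward, and reports the ensemble mean and spread. Model, member count and perturbation size are all illustrative assumptions.

```python
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz-63 toy atmosphere model."""
    x, y, z = state
    deriv = np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])
    return state + dt * deriv

def ensemble_forecast(x0, members=100, steps=500, perturbation=1e-3, seed=0):
    """Run many simulations from slightly perturbed initial conditions."""
    rng = np.random.default_rng(seed)
    states = x0 + perturbation * rng.standard_normal((members, 3))
    for _ in range(steps):
        states = np.array([lorenz_step(s) for s in states])
    # The mean is the forecast; the spread estimates its uncertainty.
    return states.mean(axis=0), states.std(axis=0)

mean, spread = ensemble_forecast(np.array([1.0, 1.0, 1.0]))
print("ensemble mean:", mean, "ensemble spread:", spread)
```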
Using Powerful GPU-Based Monte Carlo Simulation Engine to Model Larger Systems, Reduce Data Errors, Improve System Prototyping
July 22, 2014 8:33 am | by Jeffrey Potoff and Loren Schwiebert | Blogs
Recently, our research work got a shot in the arm because Wayne State University was the recipient of a complete high-performance compute cluster donated by Silicon Mechanics as part of its 3rd Annual Research Cluster Grant competition. The new HPC cluster gives us some state-of-the-art hardware, which will enhance the development of what we’ve been working on — a novel GPU-Optimized Monte Carlo simulation engine for molecular systems.
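The blog does not show the engine's code, so the following is only a generic sketch of the elementary operation such an engine performs in bulk on a GPU: a Metropolis Monte Carlo trial move that displaces one particle and accepts or rejects it by the Boltzmann criterion. Function names, the harmonic toy energy and all parameters are illustrative, not taken from the authors' software.

```python
import numpy as np

def metropolis_displacement(pos, energy_fn, beta=1.0, max_disp=0.1, rng=None):
    """One Metropolis trial move: displace a random particle, then accept
    or reject the move based on the change in total energy."""
    if rng is None:
        rng = np.random.default_rng()
    i = rng.integers(len(pos))
    trial = pos.copy()
    trial[i] += max_disp * rng.uniform(-1, 1, size=pos.shape[1])
    delta_e = energy_fn(trial) - energy_fn(pos)
    if delta_e <= 0 or rng.random() < np.exp(-beta * delta_e):
        return trial, True   # move accepted
    return pos, False        # move rejected

# Toy usage: particles in a harmonic trap.
energy = lambda p: 0.5 * np.sum(p**2)
pos = np.random.default_rng(1).normal(size=(10, 3))
for _ in range(1000):
    pos, _ = metropolis_displacement(pos, energy, beta=2.0)
```

On a GPU, many such trial evaluations (or the pairwise energy sums inside them) run in parallel, which is where an optimized engine gains its speed.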
Dassault Systèmes announced the acquisition of SIMPACK, a multi-body simulation technologies and solutions company. With the acquisition of SIMPACK, based near Munich, Germany, Dassault Systèmes is expanding its SIMULIA realistic multiphysics simulation technology portfolio to include multi-body mechatronic systems, from virtual concept validation to the real-time experience.
Michael M. Resch, the Director of the Stuttgart High Performance Computing Center (HLRS) will be talking about “HPC and Simulation in the Cloud – How Academia and Industry Can Benefit.” His keynote is of special interest to cloud skeptics, given that prior to 2011, Resch himself was a vocal cloud pessimist. Three years later, he feels that this technology provides a practical option for many users.
How using CPU/GPU parallel computing is the next logical step - My work in computational mathematics is focused on developing new, paradigm-shifting ideas in numerical methods for solving mathematical models in various fields. This includes the Schrödinger equation in quantum mechanics, the elasticity model in mechanical engineering, the Navier-Stokes equation in fluid mechanics, Maxwell’s equations in electromagnetism...
The National Nuclear Security Administration (NNSA) and Cray have entered into a contract agreement for a next-generation supercomputer, called Trinity, to advance the mission of the Stockpile Stewardship Program. Managed by NNSA, Trinity is a joint effort of the New Mexico Alliance for Computing at Extreme Scale, a partnership between Los Alamos and Sandia national laboratories, as part of the NNSA Advanced Simulation and Computing Program.
EPCC is delighted to be part of a team that has won an HPC Innovation Excellence Award. Presented at the International Supercomputing Conference (ISC'14) in Leipzig (22-26 June 2014), the awards recognize outstanding application of HPC for business and scientific achievements.
A relic from long before the age of supercomputers, the 169-year-old math strategy called the Jacobi iterative method is widely dismissed today as too slow to be useful. But thanks to a curious, numbers-savvy engineering student and his professor, it may soon get a new lease on life.
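For readers unfamiliar with the method, here is a minimal sketch of the classical Jacobi iteration in Python/NumPy. The test matrix, tolerance and iteration cap are illustrative only; the researchers' accelerated variant is not shown.

```python
import numpy as np

def jacobi(A, b, tol=1e-10, max_iter=10_000):
    """Solve A x = b with the Jacobi iterative method.

    Every unknown is updated from the previous iterate only, which is
    what makes each sweep naturally parallel."""
    D = np.diag(A)              # diagonal entries of A
    R = A - np.diagflat(D)      # off-diagonal part of A
    x = np.zeros_like(b, dtype=float)
    for _ in range(max_iter):
        x_new = (b - R @ x) / D  # x_i <- (b_i - sum_{j != i} a_ij * x_j) / a_ii
        if np.linalg.norm(x_new - x, ord=np.inf) < tol:
            return x_new
        x = x_new
    return x

# Diagonally dominant test system, for which Jacobi is guaranteed to converge.
A = np.array([[ 4.0, -1.0,  0.0],
              [-1.0,  4.0, -1.0],
              [ 0.0, -1.0,  4.0]])
b = np.array([15.0, 10.0, 10.0])
print(jacobi(A, b))  # approximately [4.91, 4.64, 3.66]
```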
"Big data" is playing an increasingly big role in the renewable energy industry and the transformation of the nation's electrical grid, and no single entity provides a better tool for such data than the Energy Department's Energy Systems Integration Facility (ESIF) located on the campus of the National Renewable Energy Laboratory (NREL).