A new mapping tool makes preparing for natural disasters and responding to their aftermath easier than ever. Researchers from the A*STAR Institute of High Performance Computing in Singapore have developed a computer model that analyzes networks of interconnected roads to predict the speediest routes for rescuers to take using real-time data uploaded by aid workers on the ground.
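The article does not publish the A*STAR model itself, but the core task it describes — finding the speediest route through a road network whose travel times are updated from field reports — is classically handled by shortest-path search. Below is a minimal sketch using Dijkstra's algorithm over a weighted graph; the place names and travel times are purely illustrative, and the `roads` dictionary stands in for whatever live data structure the real tool maintains.

```python
import heapq

def fastest_route(graph, start, goal):
    """Dijkstra's shortest-path search over a weighted road network.

    graph: dict mapping node -> list of (neighbor, travel_time) pairs.
    Returns (total_time, route), or (float('inf'), []) if goal is unreachable.
    """
    pq = [(0, start, [start])]   # priority queue ordered by accumulated time
    visited = set()
    while pq:
        time, node, route = heapq.heappop(pq)
        if node == goal:
            return time, route
        if node in visited:
            continue
        visited.add(node)
        for nbr, t in graph.get(node, []):
            if nbr not in visited:
                heapq.heappush(pq, (time + t, nbr, route + [nbr]))
    return float('inf'), []

# Toy network: travel times in minutes, as they might be revised
# when aid workers report a blocked or slow road.
roads = {
    "depot":    [("bridge", 10), ("hillside", 4)],
    "hillside": [("bridge", 3), ("clinic", 12)],
    "bridge":   [("clinic", 2)],
}

print(fastest_route(roads, "depot", "clinic"))
# → (9, ['depot', 'hillside', 'bridge', 'clinic'])
```

Because edge weights are looked up at query time, re-running the search after each batch of ground reports is enough to keep recommended routes current — no retraining or precomputation is needed for a network of modest size.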
For decades, neuroscientists have been trying to design computer networks that can mimic visual...
Turbulent combustion simulations, which provide input to the design of more fuel-efficient...
Not long ago, it would have taken several years to run a high-resolution simulation on a global...
Just in time for Christmas, Richard Zhang reveals how to print a 3-D Christmas tree efficiently and with zero material waste, using the world’s first algorithm for automatically decomposing a 3-D object into what are called pyramidal parts. The algorithm promises to become a big deal in the world of 3-D printing, and also has applications for designing molds and for casting.
Scientists have developed an ultrafast quantum chemical method, which allows rapid and accurate simulations of complex molecular systems consisting of thousands of molecules.
An international competition using the wisdom of crowds has developed computer algorithms to detect, predict, and ultimately prevent epileptic seizures. A total of 504 teams competed in two challenges: one for Seizure Detection and a second for Seizure Prediction.
Biological engineers have created a new computer model that allows them to design the most complex three-dimensional DNA shapes ever produced, including rings, bowls, and geometric structures such as icosahedrons that resemble viral particles.
El Niño is not a contemporary phenomenon; it’s long been the Earth’s dominant source of year-to-year climate fluctuation. But as the climate warms and the feedbacks that drive the cycle change, researchers want to know how El Niño will respond.
In the latest issue of HPC Source, “A New Dawn: Bringing HPC to the Enterprise,” we look at how small- to medium-sized manufacturers can realize major benefits from adopting high performance computing in areas such as modeling, simulation, and analysis.
Researchers have published the first study to use computational modeling to predict and identify the metabolic products of gastrointestinal (GI) tract microorganisms.
In an unexpected mashup of financial and mechanical engineering, researchers have discovered that the same modeling used to forecast fluctuations in the stock market can be used to predict aspects of animal behavior.
Researchers from NCSU have conducted innovative research that enables better prediction of thermal-hydraulic behavior for current and future nuclear reactor designs.
Researchers utilized HPC resources to run and visualize a breakthrough simulation of a long-track EF5 tornado embedded within a supercell.
To heat magnetically confined plasmas to the millions of degrees needed for fusion reactions, scientists inject megawatts of electromagnetic energy from carefully engineered radiofrequency antennas. The generated electromagnetic waves interact with the plasma in complex ways.
Devavrat Shah’s group at MIT’s Laboratory for Information and Decision Systems (LIDS) specializes in analyzing how social networks process information. Next month, at the Conference on Neural Information Processing Systems, they’ll present a paper that applies their model to the recommendation engines that are familiar from websites like Amazon and Netflix — with surprising results.
A decade of close scrutiny has shed much more light on the technical computing needs of small and medium enterprises (SMEs), but they are still shrouded in partial darkness. That’s hardly surprising for a diverse global group with millions of members ranging from automotive suppliers and shotgun genomics labs to corner newsstands and strip mall nail salons.
The predictive analytics landscape covers a wide variety of techniques and methods designed to derive insights from data. These techniques have been used successfully for many years on structured data. In recent times, the volume and variety of data available for analysis has exploded, and most of this data is in non-traditional forms.
Welcome to SCIENTIFIC COMPUTING's "Bringing HPC to Smaller Manufacturers" edition of HPC Source, an interactive publication devoted exclusively to coverage of high performance computing.
A reliable way of predicting the flow of traffic could be a great convenience for commuters, as well as a significant energy-saver. During an emergency evacuation following a natural disaster, reliable predictions of the best routes could even be a lifesaver. Now a team of researchers from MIT, the University of Notre Dame, and elsewhere has devised what they say is an effective and relatively simple formula for making such predictions.
COMSOL Multiphysics 5.0 is an interactive environment for modeling and simulating scientific and engineering problems. Any COMSOL Multiphysics model can be turned into an application with its own interface using the tools provided with the Application Builder desktop environment.
Researchers studying iron-based superconductors are combining novel electronic structure algorithms with the high-performance computing power of the Department of Energy’s Titan supercomputer at Oak Ridge National Laboratory to predict spin dynamics, or the ways electrons orient and correlate their spins in a material.
How does a normal cellular process derail and become unhealthy? A multi-institutional, international team led by Virginia Tech researchers studied cells found in breast and other types of connective tissue and discovered new information about cell transitions that take place during wound healing and cancer.
The Council on Competitiveness has released Solve. The Exascale Effect: The Benefits of Supercomputing Investment for U.S. Industry, a new report that explores the value of government leadership in supercomputing for industrial competitiveness. As the federal government pursues exascale computing to achieve national security and science missions, Solve examines how U.S.-based companies also benefit from leading-edge computation.
Complex biochemical signals that coordinate fast and slow changes in neuronal networks keep the brain in balance during learning, according to scientists reporting on a six-year quest by a collaborative team from three institutions. Their work resolves a decades-old question and opens the door to a more general understanding of how the brain learns and consolidates new experiences on dramatically different timescales.
The Jefferson Project announced new milestones in a multimillion-dollar collaboration that seeks to understand and manage complex factors impacting Lake George. A new data visualization laboratory features advanced computing and graphics systems that allow researchers to visualize sophisticated models and incoming data on weather, runoff, and circulation patterns. The lab will display streaming data from various sensors in real time.
During the 1930s, North America endured the Dust Bowl, a prolonged era of dryness that withered crops and dramatically altered where the population settled. Land-based precipitation records from the years leading up to the Dust Bowl are consistent with the telltale drying-out period associated with a persistent dry weather pattern, but they can’t explain why the drought was so pronounced and long-lasting.
To improve the modeling and reading of the branches on the human tree of life, researchers compiled the most comprehensive DNA set to date, a new treasure trove of 146 ancient (including Neanderthal and Denisovan) and modern human full mitochondrial genomes (amongst a set of 320 available worldwide).
As climate change grips the Arctic, how much carbon is leaving its thawing soil and adding to Earth's greenhouse effect? The question has long been debated by scientists. A new study conducted as part of NASA's Carbon in Arctic Reservoirs Vulnerability Experiment (CARVE) shows just how much work still needs to be done to reach a conclusion on this and other basic questions about the region where global warming is hitting hardest.