In the latest issue of HPC Source, “A New Dawn: Bringing HPC to the Enterprise,” we look at how small- to medium-sized manufacturers can realize major benefits from adopting high performance computing in areas such as modeling, simulation and analysis.
Researchers have published the first study to use computational modeling to predict and...
In an unexpected mashup of financial and mechanical engineering, researchers have...
Researchers from NCSU conducted innovative research that will allow better prediction of thermal...
Researchers used HPC resources to run and visualize a breakthrough simulation involving a long-track EF5 tornado embedded within a supercell.
To heat magnetically confined plasmas to the millions of degrees needed for fusion reactions, scientists inject megawatts of electromagnetic energy from carefully engineered radiofrequency antennas. The generated electromagnetic waves interact with the plasma in complex ways.
Devavrat Shah’s group at MIT’s Laboratory for Information and Decision Systems (LIDS) specializes in analyzing how social networks process information. Next month, at the Conference on Neural Information Processing Systems, they’ll present a paper that applies their model to the recommendation engines that are familiar from websites like Amazon and Netflix — with surprising results.
A decade of close scrutiny has shed much more light on the technical computing needs of small and medium enterprises (SMEs), but they are still shrouded in partial darkness. That’s hardly surprising for a diverse global group with millions of members ranging from automotive suppliers and shotgun genomics labs to corner newsstands and strip mall nail salons.
The predictive analytics landscape covers a wide variety of techniques and methods designed to derive insights from data. These techniques have been used successfully for many years on structured data. In recent times, the volume and variety of data available for analysis has exploded, and most of this data is in non-traditional forms.
Welcome to SCIENTIFIC COMPUTING's "Bringing HPC to Smaller Manufacturers" edition of HPC Source, an interactive publication devoted exclusively to coverage of high performance computing.
A reliable way of predicting the flow of traffic could be a great convenience for commuters, as well as a significant energy-saver. During an emergency evacuation following a natural disaster, reliable predictions of the best routes could even be a lifesaver. Now a team of researchers from MIT, the University of Notre Dame, and elsewhere has devised what they say is an effective and relatively simple formula for making such predictions.
COMSOL Multiphysics 5.0 is an interactive environment for modeling and simulating scientific and engineering problems. Any COMSOL Multiphysics model can be turned into an application with its own interface using the tools provided with the Application Builder desktop environment.
Researchers studying iron-based superconductors are combining novel electronic structure algorithms with the high-performance computing power of the Department of Energy’s Titan supercomputer at Oak Ridge National Laboratory to predict spin dynamics, or the ways electrons orient and correlate their spins in a material.
How does a normal cellular process derail and become unhealthy? A multi-institutional, international team led by Virginia Tech researchers studied cells found in breast and other types of connective tissue and discovered new information about cell transitions that take place during wound healing and cancer.
The Council on Competitiveness has released a new report, Solve. The Exascale Effect: The Benefits of Supercomputing Investment for U.S. Industry, which explores the value of government leadership in supercomputing for industrial competitiveness. As the federal government pursues exascale computing to achieve national security and science missions, Solve examines how U.S.-based companies also benefit from leading-edge computation.
Complex biochemical signals that coordinate fast and slow changes in neuronal networks keep the brain in balance during learning, according to scientists who report on a six-year quest by a collaborative team from three institutions to solve a decades-old question and open the door to a more general understanding of how the brain learns and consolidates new experiences on dramatically different timescales.
The Jefferson Project announced new milestones in a multimillion-dollar collaboration that seeks to understand and manage complex factors impacting Lake George. A new data visualization laboratory features advanced computing and graphics systems that allow researchers to visualize sophisticated models and incoming data on weather, runoff and circulation patterns. The lab will display streaming data from various sensors in real-time.
During the 1930s, North America endured the Dust Bowl, a prolonged era of dryness that withered crops and dramatically altered where the population settled. Land-based precipitation records from the years leading up to the Dust Bowl are consistent with the telltale drying-out period associated with a persistent dry weather pattern, but they can’t explain why the drought was so pronounced and long-lasting.
To improve the modeling and reading of the branches on the human tree of life, researchers compiled the most comprehensive DNA set to date, a new treasure trove of 146 ancient (including Neanderthal and Denisovan) and modern human full mitochondrial genomes (amongst a set of 320 available worldwide).
As climate change grips the Arctic, how much carbon is leaving its thawing soil and adding to Earth's greenhouse effect? The question has long been debated by scientists. A new study conducted as part of NASA's Carbon in Arctic Reservoirs Vulnerability Experiment (CARVE) shows just how much work still needs to be done to reach a conclusion on this and other basic questions about the region where global warming is hitting hardest.
Technology is putting complex topics like severe weather and climate change on the map — literally. Mapping data associated with specific geographic locations is a powerful way to glean new and improved knowledge from data collections and to explain the results to policymakers and the public. Particularly useful is the ability to layer different kinds of geospatial data on top of one another and see how they interact.
The next time some nasty storms are heading your way, the National Weather Service says it will have a better forecast of just how close they could come to you. The weather service started using a new high-resolution computer model that officials say will dramatically improve forecasts for storms up to 15 hours in advance. It should better pinpoint where and when tornadoes, thunderstorms and blizzards are expected.
The National Oceanic and Atmospheric Administration (NOAA) and NASA are funding three demonstration projects that will lay the foundation for the first national network to monitor marine biodiversity at scales ranging from microbes to whales. The U.S. Department of the Interior's Bureau of Ocean Energy Management (BOEM) also plans to contribute.
In popular culture, mathematics is often deemed inaccessible or esoteric. Yet in the modern world, it plays an ever more important role in our daily lives and a decisive role in the discovery and development of new ideas — often behind the scenes.
Mathematicians have introduced a new element of uncertainty into an equation used to describe the behavior of fluid flows. While being as certain as possible is generally the stock-in-trade of mathematics, the researchers hope this new formulation might ultimately lead to mathematical models that better reflect the inherent uncertainties of the natural world.
Researchers have developed a scaling law that predicts a human’s risk of brain injury, based on previous studies of blasts’ effects on animal brains. The method may help the military develop more protective helmets, as well as aid clinicians in diagnosing traumatic brain injury, often referred to as the “invisible wound” of battle.
Strong solar flares can bring down communications and power grids on Earth. By demonstrating how these gigantic eruptions are caused, physicists are laying the foundations for future predictions. The shorter the interval between two explosions in the solar atmosphere, the more likely it is that the second flare will be stronger than the first one.
With President Obama announcing climate-support initiatives at the 2014 United Nations Climate Summit, the U.S. Department of Energy national laboratories are teaming with academia and the private sector to develop the most advanced climate and Earth system computer model yet created. For Los Alamos National Laboratory researchers, it is a welcome advance for an already vibrant high-performance computing community.
A computer model that accurately predicts how composite materials behave when damaged will make it easier to design lighter, more fuel-efficient aircraft. Innovative computer codes form the basis of a computer model that shows in unprecedented detail how an aircraft's composite wing, for instance, would behave if it suffered small-scale damage, such as a bird strike.