Computer simulations have predicted a new phase of matter, an atomically thin two-dimensional liquid. This prediction pushes the boundaries of possible phases of materials further than ever before. Two-dimensional materials themselves were considered impossible until the discovery of graphene around 10 years ago. However, they have been observed only in the solid phase, because the thermal atomic motion required for molten materials...
President Barack Obama said May 20, 2015, that the threat posed by climate change is evident all...
Dassault Systèmes announced that the first heart model from its “Living Heart Project” will be...
A new initiative designed to advance how scientists digitally reconstruct and analyze individual...
Evolutionary biologists and computer scientists have come together to study the evolution of pop music. Their analysis of 17,000 songs from the US Billboard Hot 100 charts, 1960 to 2010, is the most substantial scientific study of the history of popular music to date. They studied trends in style, the diversity of the charts, and the timing of musical revolutions.
For more than a decade, mathematicians and computational scientists have been collaborating with earth scientists to break new ground in modeling complex flows in energy and oil and gas applications. Their work has yielded a high-performance computational fluid dynamics and reactive transport code dubbed Chombo-Crunch that could enhance efforts to develop carbon sequestration as a way to address Earth’s growing carbon dioxide challenges.
Computer simulation of the body’s inflammatory response to traumatic injury accurately replicated known individual outcomes and predicted population results. Researchers examined blood samples from 33 survivors of car or motorcycle accidents or falls for multiple markers of inflammation, including interleukin-6, segregated the patients into categories of trauma severity, and were able to validate the model's predictions against these data.
The German Climate Computing Center is managing the world's largest climate simulation data archive, used by climate researchers worldwide. The archive consists of more than 40 petabytes of data and is projected to grow by roughly 75 petabytes annually over the next five years. As climate simulations are carried out on increasingly powerful supercomputers, massive amounts of data are produced that must be effectively stored and analyzed.
A research team has demonstrated a predictive modeling capability that can help accelerate the discovery of new materials to improve biofuel and petroleum production. The findings present a tool that could lead to more efficient processes in the biofuel and petrochemical industries, while reducing the time and cost of associated laboratory research and development efforts. The materials of interest are called zeolites.
NASA is bringing together experts spanning a variety of scientific fields for an unprecedented initiative dedicated to the search for life on planets outside our solar system. The Nexus for Exoplanet System Science, or “NExSS,” hopes to better understand the various components of an exoplanet, as well as how the planet, its host star and neighboring planets interact to support life.
If you find yourself sweating out a day that is monstrously hot, chances are you can blame humanity. A new report links three out of four such days to human effects on the climate. And as climate change worsens around mid-century, the percentage of extremely hot days attributable to man-made greenhouse gases will push past 95 percent.
Researchers have invented a new technology that can increase the bandwidth of WiFi systems by 10 times, using LED lights to transmit information. The technology could be integrated with existing WiFi systems to reduce bandwidth problems in crowded locations, such as airport terminals or coffee shops, and in homes where several people have multiple WiFi devices.
How is lightning initiated in thunderclouds? This is difficult to answer — how do you measure electric fields inside large, dangerously charged clouds? It was discovered, more or less by coincidence, that cosmic rays provide suitable probes to measure electric fields within thunderclouds. The measurements, including the strength of the electric field at a certain height in the cloud, were performed with the LOFAR radio telescope.
The Salford Predictive Modeler (SPM) software suite includes CART, MARS, TreeNet and Random Forests, as well as powerful automation and modeling capabilities. The software is designed to be a highly accurate and ultra-fast analytics and data mining platform for creating predictive, descriptive and analytical models from databases of any size, complexity or organization.
COMSOL 5.1 is a major upgrade that delivers new and enhanced functionality across all products, including COMSOL Multiphysics and the Application Builder and COMSOL Server, as well as the add-on modules. Among the significant updates are enhancements to numerous core modeling and simulation capabilities and an improved user experience for application design.
Twitter users who post information about their personal health online might be considered by some to be "over-sharers," but new research suggests that health-related tweets may have the potential to be helpful for hospitals. A predictive model using machine learning algorithms is able to predict with 75 percent accuracy how many asthma-related emergency room visits a hospital could expect on a given day.
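The study's actual model is not described here, but the general idea — relating a daily count of health-related tweets to next-day ER demand — can be sketched with a simple least-squares regression. Everything below (the tweet counts, the visit numbers, the slope of 0.1) is invented purely for illustration.

```python
import random

def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a + b*x, in closed form."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

# Synthetic illustration: daily counts of asthma-related tweets vs.
# next-day ER visits (all numbers are made up for this sketch).
random.seed(1)
tweets = [random.randint(20, 200) for _ in range(60)]
visits = [5 + 0.1 * t + random.gauss(0, 2) for t in tweets]

a, b = fit_line(tweets, visits)
predicted = a + b * 150   # expected visits on a day with 150 such tweets
```

A real system would add many more predictors (air quality, season, historical admissions) and report accuracy on held-out days, but the fit-then-predict shape is the same.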
While scientists understand how the aurora's shimmering curtains of color are formed, they have struggled to explain the black patches between the bright beams. Now Swedish and British scientists have discovered what happens at the heart of these so-called "black aurora."
Simply declaring a region a nature protection area is not enough; regular monitoring of its ecological condition is also necessary. New methods are being developed to monitor Europe’s vast number of nature protection areas from the air. Short laser pulses are sent to the ground, and information on the status of the habitat can be deduced from the reflected light signals using elaborate computer algorithms.
Most recent advances in artificial intelligence are the result of machine learning, in which computers are turned loose on huge data sets to look for patterns. To make machine-learning applications easier to build, computer scientists have begun developing so-called probabilistic programming languages, which let researchers mix and match machine-learning techniques that have worked well in other contexts.
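The core move these languages automate is inference: write a generative model, condition on observed data, and recover a distribution over the unknowns. The toy below is not any particular probabilistic programming language — it is a hand-rolled rejection sampler, the simplest inference strategy, applied to inferring a coin's bias; the coin example and all numbers are illustrative assumptions.

```python
import random

def rejection_sample_bias(observed_heads, n_flips, max_proposals=20000):
    """Infer a coin's bias generatively: draw a candidate bias from a
    uniform prior, simulate n_flips, and keep the candidate only when
    the simulated data exactly matches the observation."""
    accepted = []
    for _ in range(max_proposals):
        p = random.random()                               # prior: Uniform(0, 1)
        heads = sum(random.random() < p for _ in range(n_flips))
        if heads == observed_heads:                       # condition on data
            accepted.append(p)
    # Posterior mean estimate from the accepted samples.
    return sum(accepted) / len(accepted) if accepted else None

random.seed(0)
estimate = rejection_sample_bias(observed_heads=8, n_flips=10)
```

Having observed 8 heads in 10 flips, the accepted samples cluster near 0.75 (the exact posterior mean under a uniform prior is 9/12). A probabilistic programming language lets you state just the model and the observation while it supplies far more efficient inference than this brute-force rejection loop.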
Researchers at the University of Zurich have unveiled new technology enabling drones to recover stable flight from any position and land autonomously in failure situations. It will even be possible to launch drones by simply tossing them into the air like a baseball or recover stable flight after a system failure. Drones will be safer and smarter, with the ability to identify safe landing sites and land automatically when necessary.
Study Combines Field Experiments, Predictive Modeling to Look at How Forests Worldwide Respond to Climate Change (April 7, 2015, by Oak Ridge National Laboratory)
Researchers from the Department of Energy’s Oak Ridge National Laboratory will play key roles in an expansive new project that aims to bring the future of tropical forests and the climate system into much clearer focus by coupling field research with the development of a new ecosystem model.
Researchers have developed a new computational model that effectively simulates the mechanical behavior of biofilms. Their model may lead to new strategies for studying a range of issues from blood clots to waste treatment systems. The new model may be adapted to study clot formation in blood vessels, which can pose the risk of detaching and migrating to the lungs, a fatal event.
The Weather Company Migrates Data Services to IBM Cloud, Plans to Advance Internet of Things Solutions (March 31, 2015, by IBM)
IBM and The Weather Company have announced a global strategic alliance to integrate real-time weather insights into business to improve operational performance and decision-making. As part of the alliance, The Weather Company, including its global B2B division WSI, will shift its massive weather data services platform to the IBM Cloud and integrate its data with IBM analytics and cloud services.
New global forest maps combine citizen science with multiple data sources for an unprecedented level of accuracy about the location and extent of forestland worldwide. The maps rely on a combination of recent multi-sensor remote sensing data, statistics and crowdsourcing. By combining different data sources, and incorporating the input of trained citizen scientists, researchers were able to produce maps more accurate than any existing...
MOVIA Big Data Analytics Platform is designed to help organizations watch for important patterns in their data and generate instant alerts to users or other systems. The software enables improved prediction of trends through advanced data modeling that captures situational context, so decisions are not ‘made in a vacuum.’
Jürgen Kohler studied aerospace engineering at the University of Stuttgart. In 1992 he started his career at Mercedes-Benz AG and became Manager of Crash Simulation in 1997. From 2001 to 2005 he was Senior Manager for CAE Passive Safety, Durability and NVH, and from 2006 to 2010 for CAE Passive Safety, Durability and Stiffness CAE and Test.
The World Health Organization reports that cardiovascular diseases are the number one cause of death globally. Working to address this imperative public health problem, researchers world-wide are seeking new ways to accelerate research, raise the accuracy of diagnoses and improve patient outcomes. Several initiatives have utilized ground-breaking new simulations to advance research into aspects such as rhythm disturbances and ...
Wolfram SystemModeler 4.1 is a modeling and simulation environment for cyber-physical systems used in industries such as aerospace, automotive, pharmaceuticals, systems biology and electrical engineering. Key features include integration of Mathematica's complete suite for reliability analysis; import, via the FMI standard, from tools such as Simulink, Flowmaster and IBM Rational Rhapsody; and import of subsystems.
Los Alamos National Laboratory mechanical and thermal engineering researchers have received recognition from ANSYS, a company that provides computer-based engineering simulation capabilities, for their computational fluid dynamics (CFD) numerical simulations of the complex problem of how ocean currents affect the infrastructure of floating oil rigs.