A team of computer scientists working to improve how researchers across the sciences harness big data to solve problems has been awarded $5 million by the National Science Foundation. The team will address one of the leading challenges in tackling some of the world’s most pressing scientific issues: the ability to analyze and compute large amounts of data.
Scientists have used computer simulations to show how bacteria are able to destroy antibiotics...
Computationally intensive research in Sweden will soon get a boost from the fastest academic...
IBM announced that Caris Life Sciences is using IBM technical computing and storage technology...
Stampede Used to Perform Modeling to Advance Potential Drug Targets for Alzheimer's, Parkinson's, Schizophrenia (September 23, 2014, by Jorge Salazar, TACC)
It all begins in the brain as a flood, tens of millions of neurotransmitters handed off from one neuron to another in just a fraction of a second. Memories, dreams and learning share a common thread in this exchange of electrical and chemical signals by the nearly 100 billion spindly neurons of the brain, each cell networked to 10,000 others.
SDSC Joins Intel Parallel Computing Centers Program with Focus on Molecular Dynamics, Neuroscience and Life Sciences (September 12, 2014, by San Diego Supercomputer Center)
The San Diego Supercomputer Center (SDSC) at the University of California, San Diego, is working with semiconductor chipmaker Intel to further optimize research software to improve the parallelism, efficiency, and scalability of widely used molecular and neurological simulation technologies.
Every trillionth of a second, Panagiotis Grammatikopoulos calculates the location of each individual atom in a particle based on where it is and which forces apply. He uses a computer program to make the calculations, and then animates the motion of the atoms using visualization software. The resulting animation illuminates what happens, atom-by-atom, when two nanoparticles collide.
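The atom-by-atom update Grammatikopoulos describes is the core loop of molecular dynamics: at each tiny timestep, compute the forces on every atom, then advance positions and velocities. A minimal sketch of that loop using the standard velocity Verlet integrator is shown below; the harmonic "spring" force function is a hypothetical stand-in for the interatomic potentials a real nanoparticle simulation would use.

```python
import numpy as np

def velocity_verlet(pos, vel, force_fn, mass, dt, steps):
    """Advance (N, 3) positions and velocities with the velocity Verlet scheme.

    force_fn maps an (N, 3) position array to an (N, 3) force array.
    """
    f = force_fn(pos)
    for _ in range(steps):
        vel = vel + 0.5 * dt * f / mass      # first half-kick from current forces
        pos = pos + dt * vel                 # drift to new positions
        f = force_fn(pos)                    # recompute forces at new positions
        vel = vel + 0.5 * dt * f / mass      # second half-kick from new forces
    return pos, vel

# Toy stand-in potential: two particles joined by a harmonic spring
# (equilibrium length r0, stiffness k) instead of a real force field.
def spring_forces(pos, k=1.0, r0=1.0):
    d = pos[1] - pos[0]
    r = np.linalg.norm(d)
    f1 = -k * (r - r0) * d / r               # force on particle 1
    return np.array([-f1, f1])               # Newton's third law for particle 0

p = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0]])
v = np.zeros((2, 3))
p, v = velocity_verlet(p, v, spring_forces, mass=1.0, dt=0.01, steps=1000)
```

The positions recorded at each step are what a visualization package would then render as an animation of the colliding particles.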
Using Powerful GPU-Based Monte Carlo Simulation Engine to Model Larger Systems, Reduce Data Errors, Improve System Prototyping (July 22, 2014, by Jeffrey Potoff and Loren Schwiebert)
Recently, our research work got a shot in the arm because Wayne State University was the recipient of a complete high-performance compute cluster donated by Silicon Mechanics as part of its 3rd Annual Research Cluster Grant competition. The new HPC cluster gives us some state-of-the-art hardware, which will enhance the development of what we’ve been working on — a novel GPU-Optimized Monte Carlo simulation engine for molecular systems.
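At the heart of any Monte Carlo engine for molecular systems, GPU-optimized or not, is the Metropolis acceptance rule: propose a random trial move, then accept it with probability min(1, exp(-beta * dE)). The sketch below is illustrative only and is not drawn from the authors' codebase; the single-particle harmonic well is a hypothetical energy function standing in for a real molecular potential.

```python
import math
import random

def metropolis_step(energy_fn, coords, beta, max_disp=0.2):
    """One Metropolis trial move: displace a randomly chosen particle and
    accept with probability min(1, exp(-beta * dE)). Returns True if accepted."""
    i = random.randrange(len(coords))
    old = coords[i]
    e_old = energy_fn(coords)
    coords[i] = tuple(c + random.uniform(-max_disp, max_disp) for c in old)
    dE = energy_fn(coords) - e_old
    if dE > 0 and random.random() >= math.exp(-beta * dE):
        coords[i] = old          # reject the move: restore the old position
        return False
    return True

# Toy usage: sample one particle in a 1-D harmonic well, E(x) = x^2.
random.seed(42)
coords = [(0.0,)]
energy = lambda c: c[0][0] ** 2
n_accept = sum(metropolis_step(energy, coords, beta=2.0) for _ in range(5000))
```

A GPU implementation parallelizes the expensive part of this loop, the energy evaluation over many particle pairs, which is what makes larger systems tractable.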
Scientists at the Department of Energy’s SLAC National Accelerator Laboratory have made the first structural observations of liquid water at temperatures down to minus 51 degrees Fahrenheit, within an elusive “no man’s land” where water’s strange properties are super-amplified.
Dr. Collins is the director of the Advanced Biomedical Computing Center at the Frederick National Laboratory for Cancer Research. Dr. Collins’ research focuses on biomedical computing applications pertaining to cancer. His research group develops and applies high-performance algorithms to solve data-intensive computational biology problems in the areas of genomic analysis, pattern recognition in proteomics and imaging, molecular modeling, and systems biology.
Accelrys Insight and Accelrys Insight for Excel are designed to enhance scientific data analysis with capabilities that include the ability to run database searches directly from the Excel spreadsheet environment. The Web-based life science, discovery and innovation support environment speeds decisions by simplifying access to complex hierarchical data and implementing data-rich tooltips for scatterplots...
At this year's International Supercomputing Conference, Professor Klaus Schulten will deliver the opening keynote address on computing in biomedicine and bioengineering. Schulten, a physicist by training, now devotes his time to computational biophysics. He has contributed to several key discoveries in this area, has garnered numerous awards and honors for his work, and is considered one of the preeminent leaders in the field.
Astronomers at the University of Washington have developed a new method of gauging the atmospheric pressure of exoplanets, or worlds beyond the solar system, by looking for a certain type of molecule. And if there is life out in space, scientists may one day use this same technique to detect its biosignature — the telltale chemical signs of its presence — in the atmosphere of an alien world.
An international team of scientists has, for the first time, revealed the color scheme of an extinct marine animal using fossilized skin pigment from three multi-million-year-old marine reptiles. Previously, scientists could only guess what colors huge reptiles, such as mosasaurs and ichthyosaurs, had.
The National Energy Research Scientific Computing (NERSC) Center recently accepted “Edison,” a new flagship supercomputer designed for scientific productivity. Named in honor of American inventor Thomas Alva Edison, the Cray XC30 will be dedicated in a ceremony held at the Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab) on February 5, 2014, and scientists are already reporting results.
Dassault Systèmes, a 3-D design software, 3-D Digital Mock Up and Product Lifecycle Management (PLM) solutions provider, and Accelrys, a provider of scientific innovation lifecycle management software for chemistry, biology and materials, have announced the signing of a definitive merger agreement for Dassault Systèmes to acquire San Diego-based Accelrys, Inc.
eQUEUE is designed to be an intuitive Web-based front-end job submission tool and management portal that increases cluster utilization by making it easier to run jobs from any Web browser. It has the added value of virtually eliminating errors through pre-defined job submission scripts.
Three U.S.-based scientists won this year's Nobel Prize in chemistry on October 9, 2013, for developing powerful computer models that any researcher can use to understand complex chemical interactions and create new drugs. Research in the 1970s by Martin Karplus, Michael Levitt and Arieh Warshel has led to programs that unveil chemical processes such as how exhaust fumes are purified or how photosynthesis takes place in green leaves.
The rapid increase of investment in biotherapeutics is changing the profile of the biopharmaceutical industry and, along with it, data management in the laboratory. With attention on longer patent life, high barriers to generic equivalents and personalized medicine, an increasing portion of R&D spending is being allocated to large molecule therapies.
Schrödinger and DeltaSoft jointly announced that they have entered into a strategic partnership that will allow Schrödinger to enhance its current Enterprise Informatics offerings with DeltaSoft's flagship product ChemCart and associated tool base.
The Pistoia Alliance has released the HELM biomolecular representation standard software toolkit and editor under the permissive open source MIT license. HELM (Hierarchical Editing Language for Macromolecules) enables the representation of a wide range of biomolecules whose size and complexity render existing small-molecule and sequence-based informatics methodologies impractical or unusable.
Scientists at Charité – Universitätsmedizin Berlin have used a computer simulation to predict the effectiveness of various combination therapies for colon tumors. In most tumors, the communication between the individual cells is disturbed and the cells permanently receive growth and survival signals. For this reason, modern tumor therapy increasingly uses drugs that target those molecules in order to shut down the faulty signals.
Core Informatics, a provider of data management solutions to the life sciences, molecular diagnostics and energy industries, and OpenEye Scientific Software, a developer of innovative molecular modeling and cheminformatics solutions for molecular discovery, have announced a new partnership and the integration of OpenEye cheminformatics software into Core Informatics’ web-based Core LIMS and Core ELN.
A National Science Foundation (NSF)-supported world-class supercomputer named Stampede, which has already enabled research teams to predict where and when earthquakes may strike, how much sea levels could rise, and how fast brain tumors grow, was officially dedicated on March 27, 2013.
Irvine, Calif., March 25, 2013 — UC Irvine neurobiologists have found a novel molecular mechanism that helps trigger the formation of long-term memory. The researchers believe the discovery of this mechanism adds another piece to the puzzle in the ongoing effort to uncover the mysteries of memory and, potentially, certain intellectual disabilities.
The flip of a single molecular switch helps create the mature neuronal connections that allow the brain to bridge the gap between adolescent impressionability and adult stability. Now Yale School of Medicine researchers have reversed the process, recreating a youthful brain that facilitated both learning and healing in the adult mouse.
Vanderbilt University researchers have combined small-angle X-ray and neutron scattering with dynamic molecular modeling to determine how the structure of RPA responds as it engages DNA.
Cycle Computing announced it has ended its record-breaking 2012 by winning the IDC HPC Innovation Excellence Award. IDC recognized Cycle’s 50,000-core utility supercomputer run in the Amazon Web Services (AWS) cloud for pharmaceutical companies Schrödinger and Nimbus Discovery. The unprecedented cluster completed 12.5 processor years in less than three hours at a cost of less than $4,900 per hour to facilitate computational drug discovery.
Detailed studies of one of the best-performing organic photovoltaic materials reveal an unusual bilayer lamellar structure that may help explain the material’s superior performance at converting sunlight to electricity and guide the synthesis of new materials with even better properties.