Manuel Peitsch, co-founder of the Swiss Institute of Bioinformatics, will chair a session on high-performance computing (HPC) in the life sciences at ISC’14 in Leipzig, Germany, in June. Peitsch is also a professor of bioinformatics at the University of Basel in Switzerland and is vice president of biological systems research at Philip Morris International.
On the first anniversary of President Obama's BRAIN Initiative announcement, the National...
A new DARPA technology office will merge biology, engineering and computer science to harness...
Is Big Data really the biggest challenge at the moment for translational science? Certainly there are issues with the complexity and size of omics data, which Big Data techniques can help address, but there are two more pressing challenges: enabling collaboration whilst facilitating information sharing, and better interpreting multiple omics data types in combination (multi-omics).
Crops that produce more while using less water seem like a dream for a world with a burgeoning population and already strained food and water resources. This dream is coming closer to reality for University of Illinois at Urbana-Champaign researchers who have developed a new computer model that can help plant scientists breed better soybean crops.
Flowering plants attract pollinating insects with bright colors and scent from their flowers. When infested with herbivores such as caterpillars, they attract beneficial insects like parasitic wasps with the help of scent signals from their leaves. The wasps then lay their eggs in the caterpillars, and the developing wasp larvae kill their caterpillar hosts.
In 2007, MIT scientists developed a type of microscopy that allowed them to detail the interior of a living cell in three dimensions, without adding any fluorescent markers or other labels. This technique also revealed key properties, such as the cells’ density.
Computational Biophysicist Klaus Schulten to Speak on Large-Scale Computing in Biomedicine and Bioengineering (March 20, 2014, by ISC)
Dr. Klaus Schulten, a leading computational biophysicist and professor of physics at the University of Illinois at Urbana-Champaign, will discuss “Large-Scale Computing in Biomedicine and Bioengineering” in the opening keynote address at the 2014 International Supercomputing Conference. ISC’14 will be held June 22-26 in Leipzig, Germany.
Scientists have revived a moss plant that was frozen beneath the Antarctic ice and seemingly lifeless since the days of Attila the Hun. Dug up from Antarctica, the simple moss was about 1,600 years old, black and looked dead. But when it was thawed in a British lab's incubator, something happened. It grew again.
At this year's International Supercomputing Conference, Professor Klaus Schulten will deliver the opening keynote address on computing in biomedicine and bioengineering. Schulten, a physicist by training, now devotes his time to computational biophysics. He has contributed to several key discoveries in this area, has garnered numerous awards and honors for his work, and is considered one of the preeminent leaders in the field.
On August 3, 2004, NASA’s Mercury Surface, Space Environment, Geochemistry, and Ranging (MESSENGER) spacecraft began a seven-year journey, spiraling through the inner solar system to Mercury. One year after launch, the spacecraft zipped around Earth, getting an orbit correction from Earth’s gravity and getting a chance to test its instruments by observing its home planet.
Steve Conway, vice president of HPC at IDC, explains that, to date, most data-intensive HPC jobs in the government, academic and industrial sectors have involved the modeling and simulation of complex physical and quasi-physical systems. However, he notes that from the start of the supercomputer era in the 1960s, and even earlier, an important subset of HPC jobs has involved analytics: attempts to uncover useful information and patterns in the data itself.
This month’s review is a bit off the usual track of statistical, mathematical and genomics software. However, it includes much pertinent information for chemists, chemical engineers and biologists. SciFinder is a search engine for chemistry and biology references covering just about anything that can be accurately described in its search feature.
A new type of giant virus called "Pithovirus" has been discovered in the frozen ground of extreme north-eastern Siberia. Buried underground, this giant virus, which is harmless to humans and animals, has survived being frozen for more than 30,000 years. It is the largest virus ever discovered.
The Department of Energy’s National Energy Research Scientific Computing Center (NERSC) announced the winners of its second annual High Performance Computing (HPC) Achievement Awards on February 4, 2014, during the annual NERSC User Group meeting at Lawrence Berkeley National Laboratory (Berkeley Lab).
Researchers have revealed ancient conditions that almost ended life on Earth, using a new technique they developed to hunt for mineral deposits. The first life developed in the ancient oceans around 3.6 billion years ago, but then life remained as little more than a layer of slime for a billion years. Suddenly, 550 million years ago, evolution burst back into action. So, what was the hold-up during those ‘boring billion’ years?
Although the time and cost of sequencing an entire human genome has plummeted, analyzing the resulting three billion base pairs of genetic information from a single genome can take many months. However, a team working with Beagle, one of the world's fastest supercomputers devoted to life sciences, reports that genome analysis can be radically accelerated. This computer is able to analyze 240 full genomes in about two days.
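The scale of that speed-up is easy to check with back-of-envelope arithmetic; assuming the stated 240 genomes were processed over a full two days, the average wall-clock share per genome works out to about twelve minutes:

```python
# Back-of-envelope throughput implied by the Beagle result
# (assumes 240 whole genomes analyzed over a full 48 hours).
genomes = 240
hours = 48
minutes_per_genome = hours * 60 / genomes  # average wall-clock share per genome
print(minutes_per_genome)  # 12.0 minutes, versus months on a workstation
```

The figure is an average across concurrent jobs, not the latency of a single genome, but it illustrates why dedicated life-science supercomputers change what analyses are practical.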
Many neglected diseases each affect only a small number of patients worldwide, yet collectively the number of patients is significant. Kun-Yi Hsin is working on precisely this problem. In a recent article, he describes his work identifying potential drugs and their targets using a computational approach that could lower the cost of drug development while speeding up drug discovery.
XPRIZE has announced that team registration is open for the $2 million Wendy Schmidt Ocean Health XPRIZE, a competition to incentivize breakthroughs in ocean pH sensor technology that will radically transform our understanding of ocean acidification. Teams are expected to come from diverse backgrounds, ranging from nanotechnology and biotechnology to industrial chemistry and marine science.
A Lawrence Livermore National Laboratory physicist and his colleagues have found a new application for the tools and mathematics typically used in physics to help solve problems in biology. Specifically, the team used statistical mechanics and mathematical modeling to shed light on something known as epigenetic memory — how an organism can create a biological memory of some variable condition, such as quality of nutrition or temperature.
ScienceCloud is a SaaS-based information management and collaboration workspace for externalized life science research and development. It is designed to advance collaborative drug discovery with a new generation of integrated applications built on a scalable, cloud-based scientific platform.
The rapid increase of investment in biotherapeutics is changing the profile of the biopharmaceutical industry and, along with it, data management in the laboratory. With attention on longer patent life, high barriers to generic equivalents and personalized medicine, an increasing portion of R&D spending is being allocated to large molecule therapies, such as monoclonal antibodies.
Researchers at Freie Universität Berlin, the Bernstein Fokus Neuronal Basis of Learning, and the Bernstein Center Berlin have developed a robot that perceives environmental stimuli and learns to react to them. The scientists used the relatively simple nervous system of the honeybee as a model for its working principles. To this end, they installed a camera on a small robotic vehicle and connected it to a computer.
The National Energy Research Scientific Computing (NERSC) Center recently accepted “Edison,” a new flagship supercomputer designed for scientific productivity. Named in honor of American inventor Thomas Alva Edison, the Cray XC30 will be dedicated in a ceremony held at the Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab) on February 5, 2014, and scientists are already reporting results.
The Center for Biotechnology (CeBiTec) at Bielefeld University has added TimeLogic’s latest J-series Field Programmable Gate Array (FPGA) hardware to its computational tools platform. TimeLogic’s DeCypher systems are designed to greatly increase the speed of sequence comparison by combining custom FPGA circuitry with optimized implementations of BLAST, Smith-Waterman, Hidden Markov Model and gene modeling algorithms.
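To make concrete what such hardware accelerates, here is a minimal CPU sketch of the Smith-Waterman local alignment score, the dynamic-programming kernel named above. The scoring values (match, mismatch, gap) are illustrative, not TimeLogic's; production systems also use affine gaps and substitution matrices.

```python
def smith_waterman(a, b, match=2, mismatch=-1, gap=-2):
    """Return the best local alignment score between sequences a and b
    using Smith-Waterman dynamic programming with a linear gap penalty."""
    rows, cols = len(a) + 1, len(b) + 1
    H = [[0] * cols for _ in range(rows)]  # score matrix, zero-initialized
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            diag = H[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            # local alignment: scores never drop below zero
            H[i][j] = max(0, diag, H[i - 1][j] + gap, H[i][j - 1] + gap)
            best = max(best, H[i][j])
    return best

print(smith_waterman("GATTACA", "GATTACA"))  # 14: seven matches at +2 each
```

Each cell of the matrix depends only on its three upper-left neighbors, so anti-diagonals can be computed in parallel, which is exactly the structure FPGA pipelines exploit.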
“Life is a DNA software system,” says J. Craig Venter, PhD: biology can be digitized, the information sent via the Internet, and viruses and other life forms recreated using the emerging tools of synthetic biology. Dr. Venter describes his vision for applying biological teleportation to send digitized biological information around the world and from Mars to Earth in an interview in Industrial...
From April 1 – 4, the International Trade Fair for Laboratory Technology, Analysis and Biotechnology will be a center for key players in science and industry. This year’s analytica will revolve around three main themes: food analysis, plastics analysis, and genetic and bioanalysis, whether in the exhibition, the Live Labs or the program of related events.