The drug development process often misses side effects that kill at least 100,000 patients a year. LLNL researchers have devised a method that uses high-performance computers to process proteins and drug compounds through an algorithm that identifies the proteins responsible for adverse drug reactions, producing reliable data for drug discovery outside of a laboratory setting.
NeuroSolutions Infinity predictive data analytics and modeling software is designed to...
Robo Brain — a large-scale computational system that learns from publicly available Internet...
A wealth of images of Earth at night taken by astronauts on the International Space Station (ISS...
University of Wisconsin researchers used HPC resources, together with several advanced protein structure prediction algorithms and deep sequence data mining, to construct a highly plausible capsid model for Rhinovirus-C (~600,000 atoms). The simulation model helps explain why existing pharmaceuticals are ineffective against the virus.
Making changes within a complex software system is often error-prone – even the smallest mistake can endanger the entire system. Ten years ago, computer scientists in Saarbrücken, led by Professor Andreas Zeller, developed a technique that automatically issues suggestions on how to manage changes...
Athens, Ga. – New tools to collect and share information could help stem the loss of the world's threatened species, according to a paper published today in the journal Science. The study—by an international team of scientists that included John L. Gittleman, dean of the University of Georgia Odum...
Huynh Phung Huynh's research interests include high-performance computing (HPC): compiler optimization for GPUs, many-core processors and other accelerators; parallel computing: frameworks for parallel programming and scheduling; and HPC for data mining and machine learning algorithms.
Computer technology that can mine data from social media during natural and other disasters could provide invaluable insights for rescue workers and decision makers. Advances in information technology have had a profound impact on disaster management.
The new HITS research group “Data Mining and Uncertainty Quantification” analyzes large amounts of data and quantifies uncertainties in technical systems. Led by Prof. Vincent Heuveline, the group of mathematicians and computer scientists focuses in particular on improving the safety of technology in operating rooms.
Josiah Stamp said: “The individual source of the statistics may easily be the weakest link.” Nowhere is this more true than in the new field of text mining, given the wide variety of textual information. By some estimates, 80 percent of available information exists as free-form text which, prior to the development of text mining, had to be read in its entirety for information to be extracted from it.
Researchers are developing computers capable of "approximate computing" – performing calculations that are good enough for tasks that don't require perfect accuracy – potentially doubling efficiency and reducing energy consumption.
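Approximate computing spans both hardware and software techniques. One simple software-level example is loop perforation, where a computation deliberately skips part of its input and accepts a small, bounded error in exchange for proportionally less work. The sketch below is a hypothetical illustration of that idea in Python, not code from the research described above:

```python
import random

def approx_mean(values, keep_rate=0.25):
    """Estimate the mean by visiting only ~keep_rate of the input.

    Loop perforation: the work drops proportionally to keep_rate,
    while the estimate stays close to the exact mean for
    well-behaved data.
    """
    sampled = [v for v in values if random.random() < keep_rate]
    if not sampled:  # degenerate case: fall back to the exact answer
        return sum(values) / len(values)
    return sum(sampled) / len(sampled)

data = [random.gauss(100.0, 15.0) for _ in range(1_000_000)]
exact = sum(data) / len(data)
approx = approx_mean(data)
print(f"exact={exact:.3f}  approx={approx:.3f}  error={abs(exact - approx):.3f}")
```

Doing a quarter of the work for a fraction-of-a-percent error is exactly the trade-off approximate computing formalizes, whether implemented in software or in reduced-precision hardware.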
Everything leading up to the actual coding, figuring out how to make it work, is what Samak enjoys most. One of the problems she is working on with the Department of Energy’s Joint Genome Institute (JGI) is a data mining method to automatically identify errors in genome assembly, replacing the current approach of manually inspecting the assembly.
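The JGI method itself isn't described in detail here, but one signal such data mining commonly exploits is read coverage: regions whose sequencing depth deviates sharply from the assembly-wide norm often mark collapsed repeats or mis-joins. A minimal, hypothetical sketch of that heuristic (all names are illustrative, not from JGI's tool):

```python
import statistics

def flag_suspect_windows(coverage, window=100, z_cutoff=3.0):
    """Flag assembly windows whose mean read coverage is a statistical
    outlier -- a common heuristic for locating likely mis-assemblies.

    coverage: per-base read depth along a contig (list of ints).
    Returns (start, end, mean_depth) tuples for suspect windows.
    """
    means = [sum(coverage[i:i + window]) / window
             for i in range(0, len(coverage) - window + 1, window)]
    if len(means) < 2:
        return []
    mu = statistics.mean(means)
    sigma = statistics.stdev(means) or 1.0
    suspects = []
    for idx, m in enumerate(means):
        if abs(m - mu) / sigma > z_cutoff:  # unusually high or low depth
            suspects.append((idx * window, idx * window + window, m))
    return suspects
```

An automated pass like this replaces eyeballing coverage plots: only the flagged windows need human review.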
In his 1937 book, "Think and Grow Rich," author Napoleon Hill identified 13 steps to success, one of which was the power of the mastermind. "No two minds ever come together without thereby creating a third, invisible, intangible force, which may be likened to a third mind," Hill wrote.
Recent announcements by Intel and NVIDIA indicate that massively parallel computing with GPUs and Intel Xeon Phi will no longer require passing data via the PCIe bus. The bad news is that these standalone devices are still in the design phase and are not yet available for purchase.
IBM announced on August 24, 2013, that it has added nine new academic collaborations to its more than 1,000 partnerships with universities across the globe, focusing on Big Data and analytics – all of which are designed to prepare students for the 4.4 million jobs that will be created worldwide to support Big Data by 2015. The company also announced more than $100,000 in awards for Big Data curricula.
The HPC market is entering a kind of perfect storm. For years, HPC architectures have tilted farther and farther away from an optimal balance among processor speed, memory access and I/O speed. As successive generations of HPC systems have raised peak processor performance without corresponding advances in per-core memory capacity and speed, the systems have become increasingly compute-centric.
The 14th annual KDnuggets Software Poll, conducted in May 2013, attracted record participation of 1,880 internet voters, more than doubling the previous year's numbers. KDnuggets.com is a data mining portal and newsletter publisher for the data mining community with more than 12,000 subscribers.
The time may be fast approaching for researchers to take better advantage of the vast amount of valuable patient information available from U.S. electronic health records. Lian Duan, an NJIT computer scientist with expertise in data mining, has done just that with the recent publication of "Adverse Drug Effect Detection" in the IEEE Journal of Biomedical and Health Informatics (March 2013).
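The paper's specific measures aren't reproduced here; a standard baseline for adverse drug effect detection, however, is disproportionality analysis over a 2x2 drug/event contingency table, such as the reporting odds ratio (ROR). A minimal sketch with hypothetical counts:

```python
import math

def reporting_odds_ratio(a, b, c, d):
    """Reporting odds ratio from a 2x2 drug/event contingency table.

    a: patients on the drug with the adverse event
    b: patients on the drug without the event
    c: patients not on the drug with the event
    d: patients not on the drug without the event
    Assumes all counts are nonzero. Returns (ROR, 95% CI bounds).
    """
    ror = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # std. error of log(ROR)
    lo = math.exp(math.log(ror) - 1.96 * se)
    hi = math.exp(math.log(ror) + 1.96 * se)
    return ror, lo, hi

# Hypothetical counts mined from patient records:
ror, lo, hi = reporting_odds_ratio(a=40, b=960, c=80, d=8920)
print(f"ROR={ror:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")  # signal if lower bound > 1
```

A drug/event pair whose confidence interval stays above 1 is flagged for expert review; research like Duan's compares and refines such measures on real EHR data.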
Pathway Studio, a research solution for biologists, is now available in a Web-based version. The integrated data mining and visualization software features comprehensive knowledge bases produced by applying MedScan, Elsevier’s proprietary text-mining technology, to a large corpus of biological literature.
When it officially came online at the San Diego Supercomputer Center (SDSC) in early January 2012, Gordon was instantly impressive. In one demonstration, it sustained more than 35 million input/output operations per second – then a world record.
i3D Enterprise Service integrates storage, processing and data mining in an enterprise-level private cloud. Laboratory data can be automatically and securely uploaded from instruments to a private cloud and processed there, enabling workflow execution and data mining in a fraction of the usual time.
SampleManager 11 laboratory information management system (LIMS) features advanced tools that are designed to improve laboratory process mapping, management and automation. Users can build workflows to reflect their individual laboratory processes and take ownership of workflow management.
Today, we are more connected than ever. We live in an always-on world whose digital economy has made data a new form of resource that fundamentally changes our lives. But has this revolution really occurred across R&D domains? At a time when global R&D investment is over $1.5 trillion, leading voices still bemoan a lack of open access to decision-making data and an innovation deficit syndrome.
Art Levinson, Sergey Brin, Anne Wojcicki, Mark Zuckerberg, Priscilla Chan and Yuri Milner announced the launch of the Breakthrough Prize in Life Sciences, recognizing excellence in research aimed at curing intractable diseases and extending human life.
The San Diego Supercomputer Center (SDSC) at the University of California, San Diego, is seeking innovative applications for the next round of user allocations on its data-intensive Gordon supercomputer, which went into operation earlier this year.
Researchers at Columbia Engineering have developed new software that can simultaneously calculate the carbon footprints of thousands of products faster than ever before. “Our novel approach generates standard-compliant product carbon footprints for companies with large portfolios at a fraction of the previously required time and expertise,” the researchers say.
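The announcement doesn't detail the software's internals, but portfolio-scale carbon footprinting commonly builds on matrix-based life cycle assessment, where a single linear solve yields footprint intensities that can then score any number of products cheaply. A toy sketch with made-up numbers:

```python
import numpy as np

# A[i, j]: units of process i consumed per unit output of process j
# (a tiny, hypothetical 3-process system).
A = np.array([[0.0, 0.2, 0.1],
              [0.1, 0.0, 0.3],
              [0.0, 0.1, 0.0]])

# e[i]: kg CO2e emitted directly per unit output of process i.
e = np.array([2.0, 0.5, 1.2])

# Footprint intensity per unit of final demand: e^T (I - A)^{-1}.
# One solve here, then scoring each product is a cheap dot product.
intensity = np.linalg.solve((np.eye(3) - A).T, e)

# Each row: one product's final demand on the three processes.
products = np.array([[1.0, 0.0, 0.0],
                     [0.0, 2.0, 0.5]])
print(products @ intensity)  # kg CO2e per product
```

Because the expensive inversion is shared across the whole portfolio, adding products scales linearly – one plausible reason tools of this kind can footprint thousands of products at once.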
Laboratories working in the pharmaceutical industry in the areas of R&D and quality control find themselves increasingly having to cope with conflicting demands — tougher regulatory requirements and harsher economic realities. In order to meet these demands, new ways of dealing with process, data and system management are necessary.