Unless some groundbreaking solutions are forthcoming, exascale computing may remain more fancy than fact John Kirkley When the Defense Advanced Research Projects Agency (DARPA) issued the report “Exascale Computing Study: Technology Challenges in Achieving Exascale Systems”1 on September 28, 2008, it sent shock waves through the high performance computing (HPC) community. The report flatly stated that current technology trends were “insufficient” to achieve exascale-level systems in the next five to 10 years. The biggest stumbling block? Power.
Using immersion cooling to reach the next level of power density and efficiency Phil E. Tuma Progress in leadership-class computing is being hindered by the limitations of conventional air cooling technology. Multicore chip architectures, faster memory and increases in parallelism have meant an increase in the amount of computational power that must be devoted to communication. While evolving technologies such as 3-D packaging, low-loss materials and improved Z-axis and optical interconnect will play an important role in increasing off-chip and inter-node bandwidth, decreasing signal path length through increased packaging density remains a tried-and-true strategy.
A composite look at four laws, decisions and guidelines related to pharmaceutical data Sandy Weinberg, Ph.D. and Ronald Fuqua, Ph.D. In the U.S. pharmaceutical industry, the collection, storage, mining and analysis of data are subject to a number of disjointed, uncoordinated and occasionally contradictory regulatory restrictions. Pharmaceutical data falls into two general categories, each with differing regulatory oversight and guidelines. In the developmental process, the clinical data that describes tests of product safety and efficacy falls under the purview of the U.S. Food and Drug Administration (FDA).
Today’s technology will improve tomorrow’s computer memory Mike May Although advances in floating point operations per second (FLOPS) often take center stage in high performance computing, faster computation cannot keep forging ahead without equally improved data-storage capabilities. The question is: What technology will spawn tomorrow’s best memory?
A storage system modeled after Google’s BigTable has the edge in data management for next-generation Internet and cloud computing users, claim researchers at the University of Texas – Pan American (UTPA) in Edinburg. In tests designed to find the best storage technologies for Web 3.0 — also known as the Semantic Web — Apache’s Hadoop database, HBase, outperformed MySQL Cluster.
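The excerpt does not include code, but the BigTable data model that HBase follows can be sketched in a few lines: a sparse, sorted map from a row key and column (family:qualifier) to timestamped values. The illustration below is a hypothetical sketch in plain Python rather than the HBase API; the table, row keys and column names are invented for illustration only.

```python
# Hypothetical sketch of the BigTable/HBase data model: a sparse map from
# (row key, column family:qualifier, timestamp) to an uninterpreted value.
import time
from collections import defaultdict


class WideColumnTable:
    def __init__(self):
        # row key -> column -> list of (timestamp, value), newest first
        self.rows = defaultdict(lambda: defaultdict(list))

    def put(self, row_key, column, value, timestamp=None):
        ts = timestamp if timestamp is not None else time.time()
        cells = self.rows[row_key][column]
        cells.append((ts, value))
        cells.sort(reverse=True)  # keep the newest version first

    def get(self, row_key, column):
        cells = self.rows[row_key].get(column)
        return cells[0][1] if cells else None  # newest version wins


# Usage: store an RDF-style triple for a Semantic Web entity.
table = WideColumnTable()
table.put("person:alice", "triples:knows", "person:bob")
print(table.get("person:alice", "triples:knows"))  # -> person:bob
```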
International Data Corporation (IDC) announced the first recipients of the new HPC Innovation Excellence Award at the ISC’11 International Supercomputing Conference Steve Conway The HPC Innovation Excellence Award recognizes noteworthy achievements by users of high performance computing (HPC) technologies.
Navigating a Sea of Options Michael H. Elliott In an increasingly electronic R&D world, data must be stored securely for privacy, intellectual property protection, quality, regulatory and competitive reasons. As organizations move from controlled paper notebooks to an open and collaborative ELN work environment, there are record management risks that must be addressed. Valuable intellectual property can be subject to theft, and databases are susceptible to data-altering malware and hackers. An organization must have consistent, audited and proven record management practices that are enforced across the entire spectrum of its R&D operations.
Nova Biologicals implements an integrated water, environmental and pharma LIMS/DMS Paul Pearce, Ph.D., Colin Thurston Nova Biologicals is a full-service, National Environmental Laboratory Accreditation Conference (NELAC)-accredited laboratory in Texas, providing testing and consulting services to the water, medical device, pharmaceutical, nutraceutical and food industries globally. Water testing makes up 53 percent of Nova’s total revenue, and the laboratory specializes in microbiological, chemical and toxicological testing of drinking water and wastewater samples. A team of dedicated scientists provides comprehensive diagnostic testing of specimens for the presence of infectious disease organisms and water testing under the Federal Safe Drinking Water Act.
Encouraging computerized medical device invention Sandy Weinberg, Ph.D., Ronald Fuqua, Ph.D. A patient swallows a computerized capsule, providing his physician with a series of images of the gastrointestinal tract. Another patient accesses the computer control on her wheelchair, which raises her to a standing position and follows a carefully designed exercise program to keep her legs from atrophying. A computerized “lab on a chip” provides toxicologists with a complete analysis series from a single sample. These and other computerized medical devices have two important characteristics in common: they are all innovations developed by entrepreneurs in a single country, and they represent the success stories of that country’s policies for supporting and encouraging innovation.
An extremely flexible tool for acquiring, processing and displaying data John R. Joyce, Ph.D. Origin 8.5.1 is a full-featured data analysis and graphing package that has been the subject of previous reviews in Scientific Computing. Here, we will take a more in-depth look at the many automation features to be found in both Origin and OriginPro. These include several different ways of automating its internal processes, ways for it to control external processes and ways for external processes to control it. In the following text, we will examine these capabilities and some of the ways in which they can be used, and construct examples illustrating several of these approaches.
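As one illustration of the "external process controlling Origin" case, Origin can be driven as a COM Automation server from a scripting language on Windows. The sketch below is hypothetical and not taken from the article: the ProgID Origin.ApplicationSI, the Execute entry point for LabTalk commands, and the LabTalk commands themselves are assumptions that should be checked against the Automation Server documentation for your Origin version.

```python
# Hypothetical sketch: driving Origin from an external Python process via COM on Windows.
# The ProgID "Origin.ApplicationSI" and the Execute() LabTalk entry point are assumed
# names -- verify them against your Origin version's Automation Server documentation.
import win32com.client


def run_labtalk(commands):
    # Attach to a running Origin instance, or start one if none exists.
    origin = win32com.client.Dispatch("Origin.ApplicationSI")
    for cmd in commands:
        ok = origin.Execute(cmd)  # send one LabTalk command at a time
        if not ok:
            raise RuntimeError(f"LabTalk command failed: {cmd}")


if __name__ == "__main__":
    # Illustrative commands only: create a new workbook and print a message in Origin.
    run_labtalk([
        "newbook;",
        'type "driven from an external Python process";',
    ])
```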
Interaction of two computing modeling fields provides critical disease mitigation tools Sandy Weinberg, Ph.D. and Ronald Fuqua, Ph.D. While neither may qualify as “the world’s oldest profession,” at least in risqué jokes, both professional actuaries and epidemiologists have long histories, with an interesting modern intersection. The Old Testament describes a variety of diseases in great detail: arguably Moses was, in addition to his other skills, an effective epidemiologist. And, as for actuaries, didn’t Noah count off the animals in his ark two-by-two? However, it is in the modern analysis of health trends that these two professions converge to provide a scientific basis for research and application.
Systems Integration Bennett Lass, Ph.D., PMP Web Exclusive This is the fifth article in a series on best practices in Electronic Lab Notebook (ELN) implementation. This article discusses the fourth core area: systems integration.
Be Careful What You Wish For Steve Conway, IDC Research VP, HPC In 1995, the global market for high performance computing (HPC) servers, a.k.a. supercomputers, was worth about $2 billion. By 2010, that figure had nearly quintupled to $9.5 billion, thanks to the rise of HPC clusters based on commercial, off-the-shelf (COTS) technologies.
Everyone’s a winner in the race for a common application language that can support both x86 and massively parallel hardware Rob Farber Commercial and research projects must now have parallel applications to compete for customer and research dollars. This translates into pressure on software development efforts that have to control costs while supporting a range of rapidly evolving parallel hardware platforms. What is needed is a common programming language that developers can use to create parallel applications with a single source tree that can run on current and future parallel hardware.
Proper selection requires careful review of needs and processes John R. Joyce, Ph.D. While it is common for users in various laboratories and industries to feel that their processes are unique, in many ways, they all have common needs. Similarly, in many respects, all laboratory information management systems (LIMS) are alike, or at least they should be. All must perform basic functions, such as tracking the users entering data, tracking samples as they arrive at the laboratory and move through processing, and generating analysis reports, while maintaining data integrity throughout the whole process.
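To make those basic functions concrete, here is a minimal, hypothetical sketch, not drawn from any particular LIMS product, of the record keeping involved: registering a sample, attributing every change to a user, and producing a simple report while keeping an audit trail. All names and values are invented for illustration.

```python
# Hypothetical, minimal sketch of core LIMS record keeping: sample tracking,
# user attribution, and an audit trail supporting data integrity.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class Sample:
    sample_id: str
    status: str = "received"
    results: dict = field(default_factory=dict)
    audit_trail: list = field(default_factory=list)

    def _log(self, user, action):
        # Every change is recorded with who made it and when.
        self.audit_trail.append((datetime.now(timezone.utc).isoformat(), user, action))

    def receive(self, user):
        self._log(user, "sample received at laboratory")

    def add_result(self, user, test, value):
        self.results[test] = value
        self.status = "in testing"
        self._log(user, f"result recorded: {test} = {value}")

    def report(self):
        lines = [f"Analysis report for {self.sample_id} (status: {self.status})"]
        lines += [f"  {test}: {value}" for test, value in self.results.items()]
        return "\n".join(lines)


# Usage: register a water sample, record one result, and print the report.
s = Sample("WTR-2011-0001")
s.receive(user="pjones")
s.add_result(user="pjones", test="total coliform (CFU/100 mL)", value="<1")
print(s.report())
print(len(s.audit_trail), "audit entries")
```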