In Stephen Leacock’s nonsense story, “Gertrude the Governess,” the hero, in extremis, “… flung himself upon his horse and rode madly off in all directions.” A fitting description of the state of power and cooling in today’s high-performance computing industry. Researchers and engineers at companies, government agencies and educational institutions worldwide are exploring a wide variety of solutions to problems posed by petascale systems ...
Ahhh! There is nothing like a tall, cool drink of water when thirsty. Not surprisingly, computers also prefer liquid refreshment to air cooling when hot. The choice for the technologist lies in when to make the move to liquid cooling and in which type of liquid cooling system is most appropriate.
Fifteen years ago, power and cooling didn’t make the top 10 list of issues HPC data centers were facing. That changed quickly with the rise to dominance of clusters and other highly parallel computer architectures, starting in the period 2000 to 2001 and escalating from there. In IDC’s worldwide surveys since 2006, power and cooling have consistently ranked as the number-two concern for HPC data centers.
Here’s the pitch: “We would like millions of dollars to build a supercomputer capable of calculating 150 trillion floating point operations per second (150 TFLOPS). Hundreds of scientists will use the system to investigate the causes of global warming, drugs that may cure cancer, and the origins of the universe. The machine will be built from the most advanced equipment available from NEC, Intel, NVIDIA, Mellanox, and other manufacturers...
The market for electronic laboratory notebook (ELN) software continued its upward trend in 2013, though at a slower growth rate than in previous years. While software sales still experienced a healthy increase, north of five percent, it was not the robust 20 to 30 percent of years past. Product sales are estimated at $130 million, while an additional $100 million was expended on services, support and maintenance.
At this year's International Supercomputing Conference, Professor Klaus Schulten will deliver the opening keynote address on computing in biomedicine and bioengineering. Schulten, a physicist by training, now devotes his time to computational biophysics. He has contributed to several key discoveries in this area, has garnered numerous awards and honors for his work, and is considered one of the preeminent leaders in the field.
Steve Conway, IDC Vice President for HPC, explains that, to date, most data-intensive HPC jobs in the government, academic and industrial sectors have involved the modeling and simulation of complex physical and quasi-physical systems. However, he notes that from the start of the supercomputer era in the 1960s, and even earlier, an important subset of HPC jobs has involved analytics: attempts to uncover useful information and patterns in the data itself.
This month’s review is a bit off the usual track of statistical, mathematical and genomics software. However, it includes much pertinent information for chemists, chemical engineers and biologists. SciFinder is a search engine for chemistry and biology references that can locate just about anything that can be accurately described in its search interface.
Welcome to Informatics Snapshot, a feature that highlights the standout properties of the current crop of laboratory informatics systems. It is not intended to be a full, formal review of the featured product, or to declare the product “good” or “bad”; rather, its purpose is to present some of the “diamonds and rust,” as the Joan Baez song goes. In this article, we’ll take a brief look at the LabX system.
Big Data tools such as Grok and IBM Watson are enabling large organizations to behave more like agile startups. Of the transformative technology developments that have ushered in the current frenzy of activity along the information superhighway, the 1994 invention of the “Wiki” by Ward Cunningham is among the most disruptive.
Encryption and nuclear weapons are two easily recognized examples where a combinatorial explosion is a sought-after characteristic. In the software development world, combinatorial explosions are bad. In particular, it is far too easy to become lost in the minutiae of writing code that can run efficiently on NVIDIA GPUs, AMD GPUs, x86, ARM and Intel Xeon Phi while also addressing the numerous compiler and user interface vagaries ...
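To make the scale of the problem concrete, here is a minimal sketch (ours, not the article's approach) that simply counts the build-and-test variants implied by a few axes of variation; the lists below are hypothetical and purely illustrative:

    # Count the code paths implied by portability: every hardware target
    # multiplies against every compiler and programming interface that
    # must be written, built and validated. Axes are hypothetical.
    from itertools import product

    hardware = ["NVIDIA GPU", "AMD GPU", "x86", "ARM", "Xeon Phi"]
    compilers = ["gcc", "clang", "icc", "nvcc"]
    interfaces = ["OpenMP", "OpenCL", "CUDA", "MPI"]

    variants = list(product(hardware, compilers, interfaces))
    print(f"{len(hardware)} targets x {len(compilers)} compilers x "
          f"{len(interfaces)} interfaces = {len(variants)} code paths")

Five targets, four compilers and four interfaces already imply 80 distinct code paths to validate; add operating systems, driver versions and user interfaces, and the explosion is well underway.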
Data Integrity in a Nutshell: Industry must take bold steps to assure the data used for drug quality decisions is trustworthy | by Mark E. Newton
Regulatory inspectors have started digging much deeper into data, no longer accepting batch release data and supportive testing at face value. Worse still, this scrutiny is justified: inspectors have cited a number of firms for violations of data integrity, breaching the most fundamental bond of trust between manufacturers and the regulators that inspect them...
Data integrity is a hot topic with regulatory agencies at present, as seen in recent publications in this magazine, and audit trails are an important aspect of ensuring integrity in computerized systems. The purpose of this article is to compare and contrast the EU and FDA GMP regulatory requirements for computerized system audit trails.
One of the challenges in laboratory data management is the handling and exchange of experiment data. Many vendors provide excellent instruments, but most produce data in their own proprietary formats. This leads to major difficulties for data processing, collaboration, instrument integration and archiving. The ASTM AnIML standardization effort addresses these problems by providing a neutral XML-based format for exchanging scientific data.
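As a rough illustration of what a neutral, vendor-independent document looks like in practice, here is a minimal Python sketch that emits a simplified XML fragment. The element names are loosely modeled on AnIML's Sample and Series concepts but are not schema-valid; consult the ASTM AnIML schema for the normative structure and namespaces:

    # Build a simplified, AnIML-inspired XML document. Element names are
    # illustrative only and do not follow the normative AnIML schema.
    import xml.etree.ElementTree as ET

    root = ET.Element("AnIML")
    samples = ET.SubElement(root, "SampleSet")
    ET.SubElement(samples, "Sample", name="Buffer Blank", sampleID="S-001")

    series = ET.SubElement(root, "Series", name="Absorbance", unit="AU")
    for reading in (0.012, 0.034, 0.051):   # hypothetical instrument values
        ET.SubElement(series, "Value").text = str(reading)

    # Any tool that speaks the shared format can now read the same
    # document, regardless of which instrument produced the data.
    print(ET.tostring(root, encoding="unicode"))

Because the payload is plain XML, the exchange is symmetric: a collaborator needs only a standard XML library to read the data back, not the instrument vendor's proprietary software.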
Mobile is where the money is right now in computer technology. Current leadership-class supercomputers are “wowing” the HPC world with petaflop/s performance through the combined use of several thousand GPUs or Intel Xeon Phi coprocessors but, in reality, the sale of a few thousand of these devices is insignificant when compared with the 1.5 billion cellphone processors and 190 million tablet processors ...