Natalie Bates chairs the Energy Efficient High Performance Computing Working Group (EE HPC WG). The purpose of the EE HPC WG is to drive implementation of energy conservation measures and energy efficient design in HPC. At ISC’14, Bates will chair the session titled Breaking Paradigms to Meet the Power Challenges...
Karlheinz Meier, professor of experimental physics at Heidelberg University’s Kirchhoff Institute of Physics, will deliver a keynote talk at the International Supercomputing Conference 2014 (ISC’14). The theme for this talk will be ‘Brain-derived computing beyond Von Neumann — achievements and challenges’. Meier is one of the co-directors of Europe’s Human Brain Project (HBP), where he will be leading a research group ...
Manuel Peitsch, co-founder of the Swiss Institute of Bioinformatics, will chair a session on high-performance computing (HPC) in the life sciences at ISC’14 in Leipzig, Germany, in June. Peitsch is also a professor of bioinformatics at the University of Basel in Switzerland and is vice president of biological systems research at Philip Morris International.
iSGTW speaks to Derek Groen, a post-doctoral researcher from the Centre for Computational Science at University College London (UCL), UK. He’ll be presenting his work into the optimization of hemodynamics simulation code at ISC’14, and he tells iSGTW why the event is not to be missed by early-career researchers.
Quantum computing is a technology that promises to revolutionize the IT industry. Thus far, though, it has been unable to shake its perception as a sort of permanent “technology of the future.” But, with the availability of quantum annealing computers from D-Wave, that perception might be changing. One of the first D-Wave systems has been deployed at NASA Ames Research Center, where researchers have been busy putting the machine ...
In Stephen Leacock’s nonsense story, “Gertrude the Governess,” the hero, in extremis, “… flung himself upon his horse and rode madly off in all directions.” It’s a fitting description for the state of power and cooling in today’s high performance computing industry. Researchers and engineers at companies, government agencies and educational institutions worldwide are exploring a wide variety of solutions to problems posed by petascale systems ...
Ahhh! There is nothing like a tall, cool drink of water when thirsty. Not surprisingly, computers also prefer liquid refreshment as opposed to air cooling when hot. The choice for the technologist resides in when to make the move to liquid cooling and in what type of liquid cooling system is most appropriate.
Fifteen years ago, power and cooling didn’t make the top 10 list of issues HPC data centers were facing. That changed quickly with the rise to dominance of clusters and other highly parallel computer architectures, starting in the period 2000 to 2001 and escalating from there. In IDC’s worldwide surveys since 2006, power and cooling have consistently ranked as the number two concern for HPC data centers.
Here’s the pitch: “We would like millions of dollars to build a supercomputer capable of calculating 150 trillion floating point operations per second (TFLOPS). Hundreds of scientists will use the system to investigate the causes of global warming, drugs that may cure cancer, and the origins of the universe. The machine will be built from the most advanced equipment available from NEC, Intel, NVIDIA, Mellanox, and other manufacturers...
The market for electronic laboratory notebook (ELN) software continued its upward growth trend in 2013, though at a slower rate than in previous years. While software sales still experienced a healthy increase north of five percent, it was not the robust 20 to 30 percent growth experienced in years past. Product sales are estimated at $130 million, while an additional $100 million was expended on services, support and maintenance.
At this year's International Supercomputing Conference, Professor Klaus Schulten will deliver the opening keynote address on computing in biomedicine and bioengineering. Schulten, a physicist by training, now devotes his time to computational biophysics. He has contributed to several key discoveries in this area, has garnered numerous awards and honors for his work, and is considered one of the preeminent leaders in the field.
Steve Conway, IDC vice president of HPC, explains that, to date, most data-intensive HPC jobs in the government, academic and industrial sectors have involved the modeling and simulation of complex physical and quasi-physical systems. However, he notes that from the start of the supercomputer era in the 1960s — and even earlier — an important subset of HPC jobs has involved analytics: attempts to uncover useful information and patterns in the data itself.
This month’s review is a bit off the usual track of statistical, mathematical and genomics software. However, it does include much pertinent information for chemists, chemical engineers and biologists. SciFinder is a search engine for chemistry and biology references, covering just about anything that can be accurately described in the search feature.
Welcome to Informatics Snapshot — a feature that highlights the standout properties of the current crop of laboratory informatics systems. While not intended to be a full formal review of the featured product or to indicate whether the product is considered “good” or “bad,” its purpose is to present some of the “diamonds and rust,” as the Joan Baez song goes. In this article, we’ll take a brief look at the LabX system
Big Data tools such as Grok and IBM Watson are enabling large organizations to behave more like agile startups. Of the transformative technology developments that have ushered in the current frenzy of activity along the information superhighway, the 1994 invention of the “Wiki” by Ward Cunningham is among the most disruptive.