As scientific computing moves inexorably toward the Exascale era, an increasingly urgent problem has emerged: many HPC software applications — both public domain and proprietary commercial — are hamstrung by antiquated algorithms and software unable to function in manycore supercomputing environments. Aside from developing an Exascale-level architecture, HPC code modernization is the most important challenge facing the HPC community.
Recently, the Harvard-Smithsonian Center for Astrophysics unveiled an unprecedented simulation of the universe’s development. Called the Illustris project, the simulation depicts more than 13 billion years of cosmic evolution across a cube of the universe 350 million light-years on each side. But why was it important to conduct such a simulation?
The Research Data Alliance seeks to build the social and technical bridges that enable open sharing and reuse of data, so as to address cross-border and cross-disciplinary challenges faced by researchers. This September, the RDA will be hosting its Fourth Plenary Meeting. Ahead of the event, iSGTW spoke to Gary Berg-Cross, general secretary of the Spatial Ontology Community of Practice and a member of the US advisory committee for RDA.
Albert Einstein's work laid the foundation for modern quantum mechanics. His analysis of the “spookiness” of quantum mechanics opened up a whole range of applications, including quantum teleportation and quantum cryptography, but he wasn’t completely convinced by the theory — and that story is as fascinating as the theory he attempted to nail down. Quantum mechanics is downright bizarre...
Like a Formula One race car stuck in a traffic jam, HPC hardware performance is frequently hampered by HPC software. This is because some of the most widely used application codes have not been updated for years, if ever, leaving them unable to leverage advances in parallel systems. As hardware power moves toward exascale, the imbalance between hardware and software will only get worse. The problem of updating essential scientific ...
A recent United States Pharmacopoeia (USP) stimulus to the revision process paper [1] has taken a life cycle approach to the development, validation, and use of analytical procedures. Do chromatography data systems (CDS) have adequate functions to help analytical scientists meet these requirements when the pharmacopoeia is updated?
The lack of a holistic data management environment to support virtualization has left project managers in a haze about how best to address the needs of the business. The sky is beginning to clear somewhat with recent introductions from companies such as Accelrys, Core Informatics and PerkinElmer. Those products, along with CDD, will be discussed to highlight capabilities and vendor approaches.
A complicated decision: to purchase infrastructure or run remotely in the cloud? Bandwidth and data security issues provide the easiest gating factors to evaluate, because an inability to access data kills any chance of using remote infrastructure, be it in the public cloud or at a remote HPC center. If running remotely is an option, then the challenge lies in determining the return on investment (ROI) for the remote and local options ...
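To make the ROI comparison concrete, the sketch below contrasts the cumulative cost of purchasing local infrastructure against renting equivalent remote capacity and finds the break-even point. All figures, function names, and prices are illustrative assumptions for the sake of the example, not vendor data.

```python
# Minimal sketch (hypothetical figures) of a local-vs-remote HPC cost comparison.
# Every number below is an assumption; substitute your own quotes and utilization.

def local_cost(months, capex=250_000.0, opex_per_month=4_000.0):
    """Total cost of owning a cluster: up-front purchase plus power/admin per month."""
    return capex + opex_per_month * months

def remote_cost(months, node_hours_per_month=20_000.0, price_per_node_hour=0.90):
    """Total cost of renting the same capacity from a cloud or remote HPC center."""
    return node_hours_per_month * price_per_node_hour * months

# Find the month at which owning becomes cheaper than renting, if it ever does.
for month in range(1, 61):
    if local_cost(month) < remote_cost(month):
        print(f"Local infrastructure breaks even after ~{month} months")
        break
else:
    print("Renting remains cheaper over the 5-year horizon")
```

With these assumed figures the crossover lands around month 18; a lower utilization or a steeper cloud discount shifts it substantially, which is exactly why the bandwidth and security gates should be evaluated before the arithmetic.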
I can most simply describe this book by quoting from the back cover: Motivation — “…how can you get started in a wide-ranging, interdisciplinary field that’s so clouded in hype?” Background Needed — “If you’re familiar with linear algebra, probability, and statistics, and have programming experience…”
IDC initiated the HPC Innovation Excellence Award program in 2011 to recognize innovative achievements using high performance computing (HPC). While there are multiple benchmarks to measure the performance of technical computers, there have been few ways to evaluate the economic and scientific value that HPC systems contribute. The HPC Innovation Excellence Award Program is designed to help close that gap.
This year’s International Supercomputing Conference (ISC’14) in Leipzig, Germany, is now just one month away. iSGTW speaks to Niko Neufeld ahead of his talk at the event, ‘The Boson in the Haystack,’ which will take place during the session on ‘Emerging Trends for Big Data in HPC’ on Wednesday, June 25.
Natalie Bates chairs the Energy Efficient High Performance Computing Working Group (EE HPC WG). The purpose of the EE HPC WG is to drive implementation of energy conservation measures and energy-efficient design in HPC. At ISC’14, Bates will chair the session titled ‘Breaking Paradigms to Meet the Power Challenges’...
Karlheinz Meier, professor of experimental physics at Heidelberg University’s Kirchhoff Institute of Physics, will deliver a keynote talk at the International Supercomputing Conference 2014 (ISC’14). The theme for this talk will be ‘Brain-derived computing beyond Von Neumann — achievements and challenges’. Meier is one of the co-directors of Europe’s Human Brain Project (HBP), where he will be leading a research group.
Manuel Peitsch, co-founder of the Swiss Institute of Bioinformatics, will chair a session on high-performance computing (HPC) in the life sciences at ISC’14 in Leipzig, Germany, in June. Peitsch is also a professor of bioinformatics at the University of Basel in Switzerland and is vice president of biological systems research at Philip Morris International.
iSGTW speaks to Derek Groen, a post-doctoral researcher from the Centre for Computational Science at University College London (UCL), UK. He’ll be presenting his work on the optimization of hemodynamics simulation code at ISC’14, and he tells iSGTW why the event is not to be missed by early-career researchers.