This still from a KIPAC visualization shows a jet of energy and particles streaming from a black hole. (Visualization: Ralf Kaehler / Simulation: Jonathan McKinney, Alexander Tchekhovskoy, and Roger Blandford)

Dramatically Intricate 3-D Universes Tell Important Stories about the Cosmos

August 21, 2014 3:16 pm | by Kelen Tuttle, Kavli Foundation | Comments

Recently, the Harvard-Smithsonian Center for Astrophysics unveiled an unprecedented simulation of the universe’s development. Called the Illustris project, the simulation depicts more than 13 billion years of cosmic evolution across a cube of the universe that’s 350 million light-years on each side. But why was it important to conduct such a simulation?

“There are just so many reasons why data sharing is important,” says Gary Berg-Cross, general secretary of the Spatial Ontology Community of Practice and a member of the US advisory committee for RDA.

Laying the Foundations for Better Sharing of Research Data

August 14, 2014 2:57 pm | by Andrew Purcell | Comments

The Research Data Alliance seeks to build the social and technical bridges that enable open sharing and reuse of data, so as to address cross-border and cross-disciplinary challenges faced by researchers. This September, the RDA will be hosting its Fourth Plenary Meeting. Ahead of the event, iSGTW spoke to Gary Berg-Cross, general secretary of the Spatial Ontology Community of Practice and a member of the US advisory committee for RDA.

It’s mind-blowing stuff, but Einstein wasn’t completely convinced by quantum mechanics. Courtesy of Travis Morgan, CC BY-NC-ND

Einstein vs Quantum Mechanics ... and Why He'd be a Convert Today

June 17, 2014 10:20 am | by Margaret Reid, Swinburne University of Technology | Comments

Albert Einstein's work laid down the foundation for modern quantum mechanics. His analysis of the “spookiness” of quantum mechanics opened up a whole range of applications, including quantum teleportation and quantum cryptography, but he wasn’t completely convinced by the theory of quantum mechanics — and that story is as fascinating as the theory he attempted to nail down. Quantum mechanics is downright bizarre...

High-resolution CESM simulation run on Yellowstone, featuring the CAM-5 spectral element at roughly 0.25° grid spacing and POP2 on a nominal 0.1° grid.

Building Momentum for Code Modernization: The Intel Parallel Computing Centers

June 9, 2014 12:06 pm | by Doug Black | Comments

Like a Formula One race car stuck in a traffic jam, HPC hardware performance is frequently hampered by HPC software. This is because some of the most widely used application codes have not been updated for years, if ever, leaving them unable to leverage advances in parallel systems. As hardware power moves toward exascale, the imbalance between hardware and software will only get worse. The problem of updating essential scientific ...

Do chromatography data systems (CDS) have adequate functions to help analytical scientists meet requirements when the pharmacopoeia is updated?

New CDS Functions to Meet New Regulatory Requirements?

June 5, 2014 4:17 pm | by R.D. McDowall | Comments

A recent United States Pharmacopoeia (USP) stimulus to the revision process paper [1] has taken a life cycle approach to the development, validation and use of analytical procedures. Do chromatography data systems (CDS) have adequate functions to help analytical scientists meet these requirements when the pharmacopoeia is updated?

Michael H. Elliott, CEO of Atrium Research & Consulting

Collaboration in the Cloud: Research Virtualization is Accelerating Market Evolution

June 5, 2014 12:39 pm | by Michael H. Elliott | Comments

The lack of a holistic data management environment to support virtualization has left project managers in a haze about how best to address the needs of the business. The sky is beginning to clear somewhat with recent introductions from companies such as Accelrys, Core Informatics and PerkinElmer. Those products, along with CDD, will be discussed to highlight capabilities and vendor approaches.

Hybrid cloud configurations provide the flexibility and cost benefits of the public cloud with the bandwidth, security and control of the private cloud.

Assessing Cloud ROI for HPC and Enterprise

June 5, 2014 12:26 pm | by Rob Farber | Comments

A complicated decision: To purchase infrastructure or run remotely in the cloud? Bandwidth and data security issues provide the easiest gating factors to evaluate, because an inability to access data kills any chance of using remote infrastructure, be it the public cloud or at a remote HPC center. If running remotely is an option, then the challenge lies in determining the return on investment (ROI) for the remote and local options ...

In summary, this interesting book offers many useful hints, tips and tricks for addressing specific types of problems, as well as pitfalls to avoid. I would have appreciated far more scientific examples than the business ones that were in abundance.

Doing Data Science: Straight Talk from the Front Line

June 5, 2014 11:53 am | by John A. Wass, Ph.D. | Comments

I can most simply describe this book by quoting from the back cover: Motivation — “…how can you get started in a wide-ranging, interdisciplinary field that’s so clouded in hype?” Background Needed — “If you’re familiar with linear algebra, probability, and statistics, and have programming experience…”

Precipitation plot from the Twentieth Century Reanalysis Project showing data from 1900, when a ferocious hurricane hit the Galveston, TX, area. One of the first HPC Innovation Excellence Award winners, NERSC’s international study enabled a much more detailed ...

Recognizing ROI and Innovative Application of HPC

June 3, 2014 4:39 pm | by Chirag Dekate, Ph.D. | Comments

IDC initiated the HPC Innovation Excellence Award program in 2011 to recognize innovative achievements using high performance computing (HPC). While there are multiple benchmarks to measure the performance of technical computers, there have been few formats available to evaluate the economic and scientific value HPC systems contribute. The HPC Innovation Excellence Award Program is designed to help close that gap.

Event display of a proton–lead event in the LHCb detector, showing the reconstructed tracks of particles produced in the collision. The proton beam travels from left to right. Image courtesy CERN.

Handling Big Data to Understand Antimatter at CERN’s LHCb Experiment

May 21, 2014 2:11 pm | by Andrew Purcell | Comments

This year’s International Supercomputing Conference (ISC’14) in Leipzig, Germany, is now just one month away. iSGTW speaks to Niko Neufeld ahead of his talk at the event, ‘The Boson in the Haystack,’ which will take place during the session on ‘Emerging Trends for Big Data in HPC’ on Wednesday, June 25.

Natalie Bates chairs the Energy Efficient High Performance Computing Working Group (EE HPC WG).

Meeting the Power Challenge: Natalie Bates on Creating more Energy-efficient HPC

May 8, 2014 4:39 pm | by ISC | Comments

Natalie Bates chairs the Energy Efficient High Performance Computing Working Group (EE HPC WG). The purpose of the EE HPC WG is to drive implementation of energy conservation measures and energy efficient design in HPC. At ISC’14, Bates will chair the session titled Breaking Paradigms to Meet the Power Challenges...

"We do not make predictions about the scientific outcomes of the simulation experiments, but we promise to build collaborative tools that will enable very exciting science," says Meier. Courtesy of F. Hentschel, Heidelberg University

Brain-derived Computing beyond Von Neumann

April 18, 2014 3:30 pm | by Nages Sieslack | Comments

Karlheinz Meier, professor of experimental physics at Heidelberg University’s Kirchhoff Institute of Physics, will deliver a keynote talk at the International Supercomputing Conference 2014 (ISC’14). The theme of this talk will be ‘Brain-derived computing beyond von Neumann — achievements and challenges’. Meier is one of the co-directors of Europe’s Human Brain Project (HBP), where he will be leading a research group ...

Manuel Peitsch, co-founder of the Swiss Institute of Bioinformatics

Exciting Advances: Growth of HPC in the Life Sciences

April 18, 2014 3:12 pm | by Andrew Purcell | Comments

Manuel Peitsch, co-founder of the Swiss Institute of Bioinformatics, will chair a session on high-performance computing (HPC) in the life sciences at ISC’14 in Leipzig, Germany, in June. Peitsch is also a professor of bioinformatics at the University of Basel in Switzerland and is vice president of biological systems research at Philip Morris International. 

Derek Groen, a post-doctoral researcher from the Centre for Computational Science at University College London (UCL), UK

Blood Flow in the Brain, Multi-Scale Modeling and More: Life as an Early-career HPC Researcher

April 18, 2014 2:53 pm | by Andrew Purcell | Comments

iSGTW speaks to Derek Groen, a post-doctoral researcher from the Centre for Computational Science at University College London (UCL), UK. He’ll be presenting his work on the optimization of hemodynamics simulation code at ISC’14, and he tells iSGTW why the event is not to be missed by early-career researchers.

Dr. Rupak Biswas, Deputy Director of Exploration Technology at NASA Ames

NASA’s Rupak Biswas Sees Usable Quantum Computing before End of Decade

April 17, 2014 2:45 pm | by ISC | Comments

Quantum computing is a technology that promises to revolutionize the IT industry. Thus far, though, it has been unable to shake its perception as a sort of permanent “technology of the future.”  But, with the availability of quantum annealing computers from D-Wave, that perception might be changing. One of the first D-Wave systems has been deployed at NASA Ames Research Center, where researchers have been busy putting the machine ...


