
New CDS Functions to Meet New Regulatory Requirements?

June 5, 2014 4:17 pm | by R.D. McDowall, Ph.D.

A recent United States Pharmacopoeia (USP) stimulus to the revision process paper [1] has taken a life cycle approach to the development, validation and use of analytical procedures. Do chromatography data systems (CDS) have adequate functions to help analytical scientists meet these requirements when the pharmacopoeia is updated?

Michael H. Elliott, CEO of Atrium Research & Consulting

Collaboration in the Cloud: Research Virtualization is Accelerating Market Evolution

June 5, 2014 12:39 pm | by Michael H. Elliott

The lack of a holistic data management environment to support virtualization has left project managers in a haze about how best to address the needs of the business. The sky is beginning to clear somewhat with recent introductions from companies such as Accelrys, Core Informatics and PerkinElmer. Those products, along with CDD, will be discussed to highlight capabilities and vendor approaches.

Hybrid cloud configurations provide the flexibility and cost benefits of the public cloud with the bandwidth, security and control of the private cloud.

Assessing Cloud ROI for HPC and Enterprise

June 5, 2014 12:26 pm | by Rob Farber

A complicated decision: To purchase infrastructure or run remotely in the cloud? Bandwidth and data security issues provide the easiest gating factors to evaluate, because an inability to access data kills any chance of using remote infrastructure, be it the public cloud or at a remote HPC center. If running remotely is an option, then the challenge lies in determining the return on investment (ROI) for the remote and local options ...

In summary, this interesting book has many useful hints, tips and tricks for addressing specific types of problems, as well as pitfalls to avoid. I would have appreciated far more scientific examples than the business ones that were in abundance.

Doing Data Science: Straight Talk from the Front Line

June 5, 2014 11:53 am | by John A. Wass, Ph.D.

I can most simply describe this book by quoting from the back cover: Motivation — “…how can you get started in a wide-ranging, interdisciplinary field that’s so clouded in hype?” Background Needed — “If you’re familiar with linear algebra, probability, and statistics, and have programming experience…”

Precipitation plot from the Twentieth Century Reanalysis Project showing data from 1900, when a ferocious hurricane hit the Galveston, TX, area. One of the first HPC Innovation Excellence Award winners, NERSC’s international study enabled a much more detailed ...

Recognizing ROI and Innovative Application of HPC

June 3, 2014 4:39 pm | by Chirag Dekate, Ph.D.

IDC initiated the HPC Innovation Excellence Award program in 2011 to recognize innovative achievements using high performance computing (HPC). While there are multiple benchmarks to measure the performance of technical computers, there have been few formats available to evaluate the economic and scientific value HPC systems contribute. The HPC Innovation Excellence Award Program is designed to help close that gap.

Event display of a proton–lead event in the LHCb detector, showing the reconstructed tracks of particles produced in the collision. The proton beam travels from left to right. Image courtesy CERN.

Handling Big Data to Understand Antimatter at CERN’s LHCb Experiment

May 21, 2014 2:11 pm | by Andrew Purcell

This year’s International Supercomputing Conference (ISC’14) in Leipzig, Germany, is now just one month away. iSGTW speaks to Niko Neufeld ahead of his talk at the event, ‘The Boson in the Haystack,’ which will take place during the session on ‘Emerging Trends for Big Data in HPC’ on Wednesday, June 25.

Natalie Bates chairs the Energy Efficient High Performance Computing Working Group (EE HPC WG).

Meeting the Power Challenge: Natalie Bates on Creating more Energy-efficient HPC

May 8, 2014 4:39 pm | by ISC

Natalie Bates chairs the Energy Efficient High Performance Computing Working Group (EE HPC WG). The purpose of the EE HPC WG is to drive implementation of energy conservation measures and energy efficient design in HPC. At ISC’14, Bates will chair the session titled ‘Breaking Paradigms to Meet the Power Challenges’...

"We do not make predictions about the scientific outcomes of the simulation experiments, but we promise to build collaborative tools that will enable very exciting science," says Meier. Courtesy of F. Hentschel, Heidelberg University

Brain-derived Computing beyond Von Neumann

April 18, 2014 3:30 pm | by Nages Sieslack

Karlheinz Meier, professor of experimental physics at Heidelberg University’s Kirchhoff Institute of Physics, will deliver a keynote talk at the International Supercomputing Conference 2014 (ISC’14). The theme for this talk will be ‘Brain-derived computing beyond Von Neumann — achievements and challenges’. Meier is one of the co-directors of Europe’s Human Brain Project (HBP), where he will be leading a research group.

Manuel Peitsch, co-founder of the Swiss Institute of Bioinformatics

Exciting Advances: Growth of HPC in the Life Sciences

April 18, 2014 3:12 pm | by Andrew Purcell

Manuel Peitsch, co-founder of the Swiss Institute of Bioinformatics, will chair a session on high-performance computing (HPC) in the life sciences at ISC’14 in Leipzig, Germany, in June. Peitsch is also a professor of bioinformatics at the University of Basel in Switzerland and is vice president of biological systems research at Philip Morris International. 

Derek Groen, a post-doctoral researcher from the Centre for Computational Science at University College London (UCL), UK

Blood Flow in the Brain, Multi-Scale Modeling and More: Life as an Early-career HPC Researcher

April 18, 2014 2:53 pm | by Andrew Purcell

iSGTW speaks to Derek Groen, a post-doctoral researcher from the Centre for Computational Science at University College London (UCL), UK. He’ll be presenting his work on the optimization of hemodynamics simulation code at ISC’14, and he tells iSGTW why the event is not to be missed by early-career researchers.

Dr. Rupak Biswas, Deputy Director of Exploration Technology at NASA Ames

NASA’s Rupak Biswas Sees Usable Quantum Computing before End of Decade

April 17, 2014 2:45 pm | by ISC

Quantum computing is a technology that promises to revolutionize the IT industry. Thus far, though, it has been unable to shake its perception as a sort of permanent “technology of the future.”  But, with the availability of quantum annealing computers from D-Wave, that perception might be changing. One of the first D-Wave systems has been deployed at NASA Ames Research Center, where researchers have been busy putting the machine ...


HPC Power and Cooling Heat Up

April 4, 2014 11:02 am | by John Kirkley

In Stephen Leacock’s nonsense story, “Gertrude the Governess,” the hero, in extremis, “… flung himself upon his horse and rode madly off in all directions.” A fitting description for the state of power and cooling in today’s high performance computing industry. Researchers and engineers at companies, government agencies and educational institutions worldwide are exploring a wide variety of solutions to problems posed by petascale systems ...


Is Your Computer Thirsty?

April 4, 2014 10:48 am | by Rob Farber

Ahhh! There is nothing like a tall, cool drink of water when thirsty. Not surprisingly, computers also prefer liquid refreshment as opposed to air cooling when hot. The choice for the technologist resides in when to make the move to liquid cooling and in what type of liquid cooling system is most appropriate.


Power and Cooling: The Sword of Damocles?

April 4, 2014 10:38 am | by Steve Conway

Fifteen years ago, power and cooling didn’t make the top 10 list of issues HPC data centers were facing. That changed quickly with the rise to dominance of clusters and other highly parallel computer architectures, starting in the period 2000 to 2001 and escalating from there. In IDC’s worldwide surveys since 2006, power and cooling have consistently ranked as the number two concern for HPC data centers ...


Is TSUBAME-KFC a Game-changer?

April 4, 2014 10:23 am | by Kirk W. Cameron, Ph.D.

Here’s the pitch: “We would like millions of dollars to build a supercomputer capable of calculating 150 trillion floating point operations per second (150 TFLOPS). Hundreds of scientists will use the system to investigate the causes of global warming, drugs that may cure cancer, and the origins of the universe. The machine will be built from the most advanced equipment available from NEC, Intel, NVIDIA, Mellanox, and other manufacturers...
