Dr. Thomas Sterling is Professor of Informatics and Computing at the Indiana University (IU) School of Informatics and Computing, and serves as Chief Scientist and Executive Associate Director of the Center for Research in Extreme Scale Technologies (CREST).

A Quantum Leap in Computing, Maybe

April 20, 2015 12:07 pm | by Thomas Sterling, Indiana University | Comments

Quantum computing has been a concept since the 1980s, but it has remained outside the domain of real-world HPC. Through the era of Moore's Law and exponential progress in feature size, clock rates and resulting performance, alternative paradigms and technologies attracted little interest. But a curiosity has persisted among a limited community, driving slow but steady advances in the associated ideas.

Trish Meek is Director of Product Strategy at Thermo Fisher Scientific.

The Case for User-Friendly Informatics in the Pharmaceutical QA/QC Lab

April 20, 2015 9:32 am | by Trish Meek, Thermo Fisher Scientific | Comments

All the computing power in the world isn’t useful if the software designed to access it is poorly designed. And we’re all much more discerning about user interfaces and usability: we expect our laboratory software to behave as intuitively as our smartphones. After all, laboratory employees are unlikely to be preoccupied with lines of code and processors — they’re focused more on how easy the software is to use.

Rob Farber is an independent HPC expert to startups and Fortune 100 companies, as well as government and academic organizations.

Opening Up Performance with OpenSpeedShop, an Open Source Profiler

April 17, 2015 12:12 pm | by Rob Farber | Comments

There are a number of excellent commercial performance analysis tools on the market. Their big drawback is that they cost money. As a result, acquisition of commercial performance analysis software falls through the cracks, as most funding agencies discourage or prohibit the use of grant money for infrastructure improvements, and few grant authors are willing to take money away from research. Open-source tools are able to fill this gap.

The book assumes no formal statistical training on the part of the reader, so the language is everyday plain. It seeks to clarify basic concepts, not to teach the intricacies of the mathematics.

Statistics Done Wrong: The Woefully Complete Guide

April 8, 2015 3:05 pm | by John A. Wass, Ph.D. | Comments

This delightful and informative guide from my friends at No Starch Press comes with the following cover blurb: “Statistics Done Wrong is a pithy, essential guide to statistical blunders in modern Science that will show you how to keep your research blunder-free.” It is somewhat pithy, but as to blunder-free, I will quote the old maxim that “nothing is foolproof, as fools are so very clever.” Still, the book has much to recommend it.

“The things I do for my housemates' downloading habit…” Maths by Sergey Nivens

How a Long-dead Mathematician called Maxwell can Speed up your Internet

March 30, 2015 1:48 pm | by Jason Cole, Imperial College London | Comments

Electromagnetic radiation – it might sound like something that you’d be better off avoiding, but electromagnetic waves of various kinds underpin our senses and how we interact with the world – from the light emissions through which your eyes perceive these words, to the microwaves that carry the Wi-Fi signal to your laptop or phone on which you’re reading it.

The Living Heart Project’s goal is to enable creation of a customized 3-D heart.

Highly Realistic Human Heart Simulations Transforming Medical Care

March 26, 2015 5:03 pm | by Suzanne Tracy, Editor-in-Chief, Scientific Computing and HPC Source | Comments

The World Health Organization reports that cardiovascular diseases are the number one cause of death globally. Working to address this urgent public health problem, researchers worldwide are seeking new ways to accelerate research, raise the accuracy of diagnoses and improve patient outcomes. Several initiatives have utilized ground-breaking new simulations to advance research into aspects such as rhythm disturbances and ...

Rob Farber is an independent HPC expert to startups and Fortune 100 companies, as well as government and academic organizations.

Optimizing Application Energy Efficiency Using CPUs, GPUs and FPGAs

March 13, 2015 8:43 am | by Rob Farber | Comments

The HPC and enterprise communities are experiencing a paradigm shift as FLOPs per watt, rather than FLOPs (floating-point operations per second), are becoming the guiding metric in procurements, system design, and now application development. In short, “performance at any cost” is no longer viable, as the operational costs of supercomputer clusters are now on par with the acquisition cost of the hardware itself.
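A back-of-the-envelope calculation shows why operational cost now rivals acquisition cost. The figures below (a 1 MW cluster, a five-year lifetime, $0.10/kWh) are illustrative assumptions, not numbers from any cited procurement:

```python
# Hypothetical comparison: why FLOPs/watt now drives procurement.
# All numbers are illustrative assumptions, not measured data.

def lifetime_energy_cost(power_watts, years, dollars_per_kwh=0.10):
    """Electricity cost of running a system continuously."""
    hours = years * 365 * 24
    kwh = power_watts / 1000.0 * hours
    return kwh * dollars_per_kwh

# Assume a 1 MW cluster run continuously for 5 years at $0.10/kWh.
cost = lifetime_energy_cost(1_000_000, 5)
print(f"5-year energy bill: ${cost:,.0f}")  # roughly $4.4 million

# FLOPs/watt as the guiding metric: a hypothetical 1 PFLOP/s system
# drawing 1 MW delivers 1 GFLOP/s per watt.
flops_per_watt = 1e15 / 1e6
print(f"Efficiency: {flops_per_watt:.1e} FLOP/s per watt")
```

At that scale the electricity bill alone is in the same range as the purchase price of many mid-size clusters, which is exactly why FLOPs per watt has displaced raw FLOPs as the figure of merit.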

NWChem molecular modeling software takes full advantage of a wide range of parallel computing systems, including Cascade. Courtesy of PNNL

PNNL Shifts Computational Chemistry into Overdrive

February 25, 2015 8:29 am | by Karol Kowalski, Ph.D., and Edoardo Apra, Ph.D. | Comments

We computational chemists are an impatient lot. Despite the fact that we routinely deal with highly complicated chemical processes running on our laboratory’s equally complex HPC clusters, we want answers in minutes or hours, not days, months or even years. In many instances, that’s just not feasible; in fact, there are times when the magnitude of the problem simply exceeds the capabilities of the HPC resources available to us.

Another myth is that scientists look like this. U.S. Army RDECOM/Flickr, CC BY-SA

Seven Myths about Scientists Debunked

February 19, 2015 2:07 pm | by Jeffrey Craig and Marguerite Evans-Galea, Murdoch Childrens Research Institute | Comments

As scientific researchers, we are often surprised by some of the assumptions made about us by those outside our profession. So we put together a list of common myths we and our colleagues have heard anecdotally regarding scientific researchers.

Not the Red Planet but Utah, one of the more Mars-like areas on Earth.

Mars is the Next Step for Humanity – We Must Take It

February 18, 2015 9:40 am | by Ashley Dove-Jay, University of Bristol | Comments

Elon Musk has built a US$12 billion company in an endeavour to pave the way to Mars for humanity. He insists that Mars is a “long-term insurance policy” for “the light of consciousness” in the face of climate change, extinction events, and our recklessness with technology. On the other hand, astronaut Chris Hadfield is sceptical: “Humanity is not going extinct,” he told me.

Suzanne Tracy, Editor-in-Chief, Scientific Computing and HPC Source

Helping to Save Lives of Critically Ill Children

February 12, 2015 10:17 am | by Suzanne Tracy, Editor-in-Chief, Scientific Computing and HPC Source | Comments

For those on the front lines of treating cancer, speed and precision are key to patients’ survival. Pediatric cancer researchers have been making incredible strides in accelerating delivery of new diagnostic and treatment options. Supercomputer-powered genetic diagnosis is being used to harness the power of high throughput genomic and proteomic methods and is playing a key role in improving the outcome for children with genetic diseases.

John Wass is a statistician based in Chicago, IL.

Explorations of Mathematical Models in Biology with Maple

February 10, 2015 9:29 am | by John A. Wass, Ph.D. | Comments

The author of this wonderful text delivers a brief, easy-to-absorb, yet very comprehensive treatment of modeling real-world data with Maple. Maple is software for performing mathematics, with a none-too-steep learning curve. In the introduction, the author is quick to point out that this is neither a detailed textbook of mathematical modeling nor of Maple. It is, however, a very well-written manual of introductory modeling and use of Maple.

Rob Farber is an independent HPC expert to startups and Fortune 100 companies, as well as government and academic organizations.

Using Profile Information for Optimization, Energy Savings and Procurements

February 9, 2015 12:11 pm | by Rob Farber | Comments

Optimization for high performance and energy efficiency is a necessary next step after verifying that an application works correctly. In the HPC world, profiling means collecting data from hundreds to potentially many thousands of compute nodes over the length of a run. In other words, profiling is a big-data task, but one where the rewards can be significant — including potentially saving megawatts of power or reducing the time to solution.
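A minimal sketch of the aggregation step at the heart of cluster-wide profiling: each node reports how many profiler samples landed in each function, and the per-node counts are merged into one cluster-wide profile. The node data below is fabricated for illustration; a real profiler would do this across thousands of nodes:

```python
from collections import Counter

# Toy model: each compute node reports how many profiler samples
# landed in each function. Fabricated data, three nodes only.
per_node_samples = [
    {"solve": 9200, "exchange_halo": 600, "io": 200},   # node 0
    {"solve": 8900, "exchange_halo": 900, "io": 200},   # node 1
    {"solve": 9100, "exchange_halo": 700, "io": 200},   # node 2
]

def aggregate(profiles):
    """Merge per-node sample counts into a cluster-wide profile."""
    total = Counter()
    for node in profiles:
        total.update(node)
    return total

totals = aggregate(per_node_samples)
grand = sum(totals.values())
for func, n in totals.most_common():
    print(f"{func:15s} {n:6d} samples ({100 * n / grand:.1f}%)")
```

Scale the three dictionaries up to tens of thousands of nodes sampled over hours, and the merge itself becomes the big-data problem the teaser describes.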

The DSCOVR Mission's NISTAR — the NIST Advanced Radiometer — will measure the Earth’s radiation budget, or whether our planet’s atmosphere is retaining more or less solar energy than it radiates back to space. Courtesy of NASA/DSCOVR

A Measurement Job That’s Truly Out of this World

February 9, 2015 10:23 am | by Chad Boutin, NIST | Comments

Quick, think of a four-letter name beginning with “N” for a federal agency involved in space science. Though NASA or NOAA would rightfully pop into mind first, crossword puzzle aficionados should know that NIST would be a correct answer as well — because the National Institute of Standards and Technology has been an integral part of readying technology for blastoff for decades.

John Wass is a statistician based in Chicago, IL.

Book Review: Applied Bayesian Modelling, 2nd Edition

January 13, 2015 8:59 am | by John A. Wass, Ph.D. | Comments

This is not a text for the novice. However, for those math/statistics aficionados, there is much to be had. The book’s great strength lies in two areas: the first is Peter Congdon’s generation of an excellent bibliography of the most modern techniques available, and the other is his (slightly) more straightforward explanations of the strengths and weaknesses of these techniques and suggestions for optimizing the results.
