Rob Farber is an independent HPC expert who works with startups and Fortune 100 companies, as well as government and academic organizations.

Optimizing Application Energy Efficiency Using CPUs, GPUs and FPGAs

March 13, 2015 8:43 am | by Rob Farber

The HPC and enterprise communities are experiencing a paradigm shift as FLOPS per watt, rather than raw FLOPS (floating-point operations per second), is becoming the guiding metric in procurements, system design, and now application development. In short, “performance at any cost” is no longer viable, as the operational costs of supercomputer clusters are now on par with the acquisition cost of the hardware itself.
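To make the metric concrete, here is a minimal sketch, using made-up figures rather than numbers from any system discussed here, of how FLOPS per watt can reorder a comparison that raw FLOPS alone would get backwards:

```python
# Minimal sketch: comparing two hypothetical systems by FLOPS per watt.
# All figures below are illustrative assumptions, not measured values.

systems = {
    "System A": {"gflops": 1_000_000.0, "watts": 800_000.0},    # 1 PFLOPS at 0.8 MW
    "System B": {"gflops": 1_500_000.0, "watts": 1_500_000.0},  # 1.5 PFLOPS at 1.5 MW
}

for name, s in systems.items():
    gflops_per_watt = s["gflops"] / s["watts"]
    print(f"{name}: {gflops_per_watt:.2f} GFLOPS/W")

# System A: 1.25 GFLOPS/W -- the slower machine is the more
# energy-efficient one, which is the point of the metric shift.
# System B: 1.00 GFLOPS/W
```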

NWChem molecular modeling software takes full advantage of a wide range of parallel computing systems, including Cascade. Courtesy of PNNL

PNNL Shifts Computational Chemistry into Overdrive

February 25, 2015 8:29 am | by Karol Kowalski, Ph.D., and Edoardo Apra, Ph.D.

We computational chemists are an impatient lot. Although we routinely deal with highly complicated chemical processes running on our laboratory’s equally complex HPC clusters, we want answers in minutes or hours, not days, months or even years. In many instances, that’s just not feasible; in fact, there are times when the magnitude of the problem simply exceeds the capabilities of the HPC resources available to us.

Another myth is that scientists look like this. U.S. Army RDECOM/Flickr, CC BY-SA

Seven Myths about Scientists Debunked

February 19, 2015 2:07 pm | by Jeffrey Craig and Marguerite Evans-Galea, Murdoch Childrens Research Institute

As scientific researchers, we are often surprised by some of the assumptions made about us by those outside our profession. So we put together a list of common myths we and our colleagues have heard anecdotally regarding scientific researchers.

Not the Red Planet but Utah, one of the more Mars-like areas on Earth.

Mars is the Next Step for Humanity – We Must Take It

February 18, 2015 9:40 am | by Ashley Dove-Jay, University of Bristol

Elon Musk has built a US$12 billion company in an endeavour to pave the way to Mars for humanity. He insists that Mars is a “long-term insurance policy” for “the light of consciousness” in the face of climate change, extinction events, and our recklessness with technology. On the other hand, astronaut Chris Hadfield is sceptical: “Humanity is not going extinct,” he told me.

Suzanne Tracy, Editor-in-Chief, Scientific Computing and HPC Source

Helping to Save Lives of Critically Ill Children

February 12, 2015 10:17 am | by Suzanne Tracy, Editor-in-Chief, Scientific Computing and HPC Source

For those on the front lines of treating cancer, speed and precision are key to patients’ survival. Pediatric cancer researchers have been making incredible strides in accelerating delivery of new diagnostic and treatment options. Supercomputer-powered genetic diagnosis is being used to harness the power of high-throughput genomic and proteomic methods and is playing a central role in improving outcomes for children with genetic diseases.

John Wass is a statistician based in Chicago, IL.

Explorations of Mathematical Models in Biology with Maple

February 10, 2015 9:29 am | by John A. Wass, Ph.D.

The author of this wonderful book delivers a brief, easy-to-absorb, yet very comprehensive text on modeling real-world data with Maple, software for performing mathematics with a none-too-steep learning curve. In the introduction, the author is quick to point out that this is a detailed textbook of neither mathematical modeling nor Maple. It is, however, a very well-written manual of introductory modeling and the use of Maple.

Rob Farber is an independent HPC expert who works with startups and Fortune 100 companies, as well as government and academic organizations.

Using Profile Information for Optimization, Energy Savings and Procurements

February 9, 2015 12:11 pm | by Rob Farber

Optimization for high performance and energy efficiency is a necessary next step after verifying that an application works correctly. In the HPC world, profiling means collecting data from hundreds to potentially many thousands of compute nodes over the length of a run. In other words, profiling is a big-data task, but one where the rewards can be significant, including potentially saving megawatts of power or reducing the time to solution.
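As a rough back-of-the-envelope illustration of why cluster-wide profiling becomes a big-data task, consider the data volume from periodic sampling. All figures below are assumptions for the sketch, not measurements from any profiler mentioned here:

```python
# Hypothetical back-of-envelope sketch of profiling data volume,
# not a description of any specific profiling tool.

nodes = 10_000              # compute nodes in the cluster (assumption)
samples_per_sec = 10        # profile samples per node per second (assumption)
bytes_per_sample = 256      # counters plus timestamps per sample (assumption)
run_hours = 24              # length of the profiled run (assumption)

total_bytes = nodes * samples_per_sec * bytes_per_sample * run_hours * 3600
print(f"Profile data collected: {total_bytes / 1e12:.1f} TB")
# Roughly 2.2 TB for a single day-long run, which is why aggregating
# and reducing profile data is itself a big-data problem.
```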

The DSCOVR Mission's NISTAR, the NIST Advanced Radiometer, will measure the Earth’s radiation budget, that is, whether our planet’s atmosphere is retaining more or less solar energy than it radiates back to space. Courtesy of NASA/DSCOVR

A Measurement Job That’s Truly Out of this World

February 9, 2015 10:23 am | by Chad Boutin, NIST

Quick, think of a four-letter name beginning with “N” for a federal agency involved in space science. Though NASA or NOAA would rightfully pop into mind first, crossword puzzle aficionados should know that NIST would be a correct answer as well — because the National Institute of Standards and Technology has been an integral part of readying technology for blastoff for decades.

John Wass is a statistician based in Chicago, IL.

Book Review: Applied Bayesian Modelling, 2nd Edition

January 13, 2015 8:59 am | by John A. Wass, Ph.D.

This is not a text for the novice. However, for those math/statistics aficionados, there is much to be had. The book’s great strength lies in two areas: the first is Peter Congdon’s generation of an excellent bibliography of the most modern techniques available, and the other is his (slightly) more straightforward explanations of the strengths and weaknesses of these techniques and suggestions for optimizing the results.

John Wass is a statistician based in Chicago, IL.

Qlucore Omics Explorer: Analysis in an Instant

January 8, 2015 4:00 pm | by John A. Wass, Ph.D.

Qlucore is a software platform for the analysis of genomics, proteomics and related data. As with most statistical and genomics software, it generates an immediate graphic for most analyses. Its specific areas of use include gene expression, protein arrays, DNA methylation, miRNA, proteomics, and pattern and structure identification in multivariate data.

Peter Boogaard is the founder of Industrial Lab Automation and chairman of the Paperless Lab Academy.

2015 Promises Major Milestones and Demands for Change

January 7, 2015 12:28 pm | by Peter J. Boogaard

Although industries won’t change working processes unless there is a mandatory need to do so, major milestones are expected in 2015 in the battle to adopt data and integration standards in our scientific community. The need to deploy these integration standards, which enable efficient sharing of knowledge across our internal and external partners, is reinforced by regulatory bodies.

Artist’s impression of a proton depicting three interacting valence quarks inside. Courtesy of Jefferson Lab

HPC Community Experts Weigh in on Code Modernization

December 17, 2014 4:33 pm | by Doug Black

Sense of urgency and economic impact emphasized: The “hardware first” ethic is changing. Hardware retains the glamour, but there is now the stark realization that the newest parallel supercomputers will not realize their full potential without reengineering the software to efficiently divide computational problems among the thousands of processors that make up next-generation many-core computing platforms.
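The kind of work division described above can be sketched in a few lines. This toy example is a hypothetical illustration, not code from the article; production HPC codes would typically use MPI or OpenMP rather than Python multiprocessing:

```python
# Toy sketch of dividing one computation across worker processes,
# in the spirit of the code-modernization argument above.
from multiprocessing import Pool

def partial_sum(chunk):
    # Each worker independently handles one slice of the problem.
    lo, hi = chunk
    return sum(x * x for x in range(lo, hi))

if __name__ == "__main__":
    n, workers = 10_000_000, 8
    step = n // workers
    chunks = [(i * step, n if i == workers - 1 else (i + 1) * step)
              for i in range(workers)]
    with Pool(workers) as pool:
        total = sum(pool.map(partial_sum, chunks))  # combine partial results
    print(total)
```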

Rob Farber is an independent HPC expert who works with startups and Fortune 100 companies, as well as government and academic organizations.

Today’s Enterprising GPUs

November 20, 2014 2:09 pm | by Rob Farber

HPC has always embraced the leading edge of technology and, as such, acts as the trailbreaker and scout for enterprise and business customers. HPC has highlighted and matured the abilities of previously risky devices, like GPUs, that enterprise customers now leverage to create competitive advantage. GPUs have moved beyond “devices with potential” to “production devices” that are used for profit generation.

John Wass is a statistician based in Chicago, IL.

Exploration and Analysis of DNA Microarray and Other High-Dimensional Data

November 18, 2014 3:10 pm | by John A. Wass, Ph.D.

The introduction of newer sequencing methodologies, DNA microarrays and high-throughput technology has resulted in a deluge of large data sets that require new strategies to clean, normalize and analyze. All of these and more are covered in approximately 300 pages with extraordinary clarity and minimal mathematics.
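As one tiny, generic illustration of the normalization problem such books address, here is a common first pass on microarray intensities, a log2 transform with per-array median centering. This is an assumption-laden sketch on simulated data, not an example drawn from the book:

```python
# Illustrative sketch of one common microarray normalization step
# (log2 transform plus per-array median centering) on simulated data.
import numpy as np

rng = np.random.default_rng(0)
intensities = rng.lognormal(mean=6, sigma=1, size=(1000, 4))  # genes x arrays

log_expr = np.log2(intensities)
normalized = log_expr - np.median(log_expr, axis=0)  # center each array at 0

print(normalized.mean(axis=0))  # per-array means, now roughly comparable
```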

Karol Kowalski, Capability Lead for NWChem Development, works in the Environmental Molecular Sciences Laboratory (EMSL) at PNNL.

Advancing Computational Chemistry with NWChem

November 18, 2014 3:07 pm | by Mike Bernhardt, HPC Community Evangelist, Intel

An interview with PNNL’s Karol Kowalski, Capability Lead for NWChem Development. NWChem is an open source, high-performance computational chemistry tool developed for the Department of Energy at Pacific Northwest National Laboratory in Richland, WA. I recently visited with Kowalski, who works in the Environmental Molecular Sciences Laboratory (EMSL) at PNNL.


