With big data, what you get out is what you put in.

Big Data Analyses Depend on Starting with Clean Data Points

August 20, 2015 12:02 pm | by H. V. Jagadish, University of Michigan | Comments

Popularly referred to as “Big Data,” mammoth sets of information about almost every aspect of our lives have triggered great excitement about what we can glean from analyzing these diverse data sets. Benefits range from better investment of resources, whether for government services or for sales promotions, to more effective medical treatments. However, real insights can be obtained only from data that are accurate and complete.

New Arctic map, with August 2015 Russian claims shown in pale yellow

The Truth about Politics and Cartography: Mapping Claims to the Arctic Seabed

August 19, 2015 8:40 am | by Philip Steinberg, Durham University | Comments

While maps can certainly enlighten and educate, they can just as easily be used to support certain political narratives. With this in mind, Durham University’s Centre for Borders Research (IBRU) has updated its map showing territorial claims to the Arctic seabed following a revised bid submitted by Russia to the United Nations on August 4, 2015. The decision to release the map was not made lightly.

Andy Weir imagined just what would happen when an astronaut was accidentally left behind on a mission to the Red Planet. What would this astronaut have to do to survive for a period of time much longer than his supplies were scheduled to last?

Aliens Among Us: Andy Weir’s The Martian Transports Him to Another World

August 18, 2015 10:07 am | by Randy C. Hice | Comments

Andy Weir is used to living on different worlds. For years, he pictured Martian landscapes in his mind, complete with all of the deadly threats presented by a planet bathed in radiation and the prospect that a human walking about would die in a very, very short time. Weir imagined just what would happen when an astronaut was accidentally left behind on a mission to the Red Planet. What would this astronaut have to do to survive?

ISU researchers are using HPC systems to understand how weather patterns affect crop plantings, such as these soybeans standing in water due to heavy Midwest rains. Courtesy of Palle Peterson, Iowa State University (Published in the ICM newsletter, June 2

When It Rains, It Pours: HPC@ISU Powers Advanced Agronomy Research

August 17, 2015 2:06 pm | by Ken Strandberg | Comments

The American Midwest has recently seen significant precipitation and two major floods — in 1998 and 2008 — from extraordinary rainfall across the Great Plains. What is causing this dramatic change in weather patterns? Is it the warming planet? Are the crops themselves influencing dramatic weather changes taking place over the last couple of decades? HPC clusters at ISU are being used to help discover answers to these questions.

Two-dimensional contour plot showing the effect of temperature and pH on compound yield

Design of Experiments Improves Peptide Bond Yield from 20% to 76%

August 11, 2015 4:03 pm | by Manpreet Bhatti, Ph.D. and Palwinder Singh, Ph.D. | Comments

DOE was used to demonstrate that temperature and pH function synergistically in the process of peptide bond formation. The optimized reaction was used to achieve sequence-specific and nonracemized synthesis of a tetrapeptide and pentapeptide at high yields. This is believed to be the first published report of constructing sequence-specific peptides in a noncatalyzed reaction.
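The synergy between temperature and pH described above is the kind of interaction a factorial design is built to detect: rather than varying one factor at a time, every combination of factor levels is run. A minimal sketch, with hypothetical levels (the published study's actual design values are not reproduced here):

```python
# Hypothetical two-factor full-factorial design over temperature and pH.
# The levels below are illustrative only, not the study's values.
from itertools import product

temperatures = [25, 40, 55]   # degrees C (assumed levels)
ph_levels = [5.0, 7.0, 9.0]   # assumed levels

# Every combination of factor levels is one experimental run, which lets
# both main effects and the temperature-pH interaction be estimated.
design = list(product(temperatures, ph_levels))
print(f"{len(design)} runs, e.g. {design[0]} and {design[-1]}")
```

Fitting a response-surface model to the yields measured at these runs is what produces contour plots like the one shown above.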

Illustration of a cellulosomal structure. Cellulosomes are highly efficient molecular machines that can degrade plant fibers. Red is the scaffoldin of the cellulosome, where most of the cohesins are, and blue are the enzymatic domains, where most of the dockerins are.

Cellulosomes: One of Life’s Strongest Biomolecular Bonds Discovered with Use of Supercomputers

July 28, 2015 3:40 pm | by Linda Barney | Comments

Researchers have discovered one of nature’s strongest mechanical bonds on a protein network called cellulosomes. The cellulosome network includes bacteria that contain enzymes that can effectively dismantle cellulose and chemically catalyze it. The discovery was aided by use of supercomputers to simulate interactions at the atomic level.

The rate of growth in computing power predicted by Gordon Moore (pictured) could be slowing. Courtesy of Steve Jurvetson, CC BY

Moore’s Law is 50 Years Old, but Will it Continue?

July 27, 2015 9:06 am | by Jonathan Borwein and David H. Bailey | Comments

It’s been 50 years since Gordon Moore, one of the founders of the microprocessor company Intel, gave us Moore’s Law. This says that the complexity of computer chips ought to double roughly every two years. Now the current CEO of Intel, Brian Krzanich, is saying the days of Moore’s Law may be coming to an end, as the time between new innovations appears to be widening.
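The arithmetic behind the law is simple exponential doubling. A minimal sketch (the 2,300-transistor figure is the Intel 4004 of 1971, used here purely as an illustrative starting point):

```python
# Moore's Law as arithmetic: transistor count doubles roughly
# every two years, i.e. count grows by 2^(years / 2).
def moores_law(initial_count, years, doubling_period=2.0):
    """Projected transistor count after `years` of doubling."""
    return initial_count * 2 ** (years / doubling_period)

# Starting from ~2,300 transistors (Intel 4004, 1971), 50 years of
# doubling every two years is 25 doublings: 2300 * 2^25, ~77 billion.
projected = moores_law(2300, 50)
```

The widening gap between process generations that Krzanich describes amounts to `doubling_period` stretching beyond two years, which flattens this curve considerably.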

The TOP500 project was started in 1993 to provide a reliable basis for tracking and detecting trends in high-performance computing.

TOP500 Answers the Most Frequently Asked Questions about the Project and the List

July 23, 2015 3:26 pm | by TOP500 | Comments

The TOP500 project was started in 1993 to provide a reliable basis for tracking and detecting trends in high-performance computing. Twice a year, a list of the sites operating the 500 most powerful computer systems is assembled and released. The best performance on the Linpack benchmark is used as a performance measure for ranking the computer systems.
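The ranking method is straightforward: systems are ordered by their best measured Linpack performance (Rmax), highest first. An illustrative sketch with made-up system names and figures, not the actual TOP500 tooling:

```python
# Illustrative only: ranking HPC systems by best Linpack result (Rmax).
# Names and performance figures below are hypothetical.
systems = [
    {"name": "System A", "rmax_pflops": 17.6},
    {"name": "System B", "rmax_pflops": 33.9},
    {"name": "System C", "rmax_pflops": 10.5},
]

# The list is ordered by Rmax, highest first; rank 1 is the fastest.
ranked = sorted(systems, key=lambda s: s["rmax_pflops"], reverse=True)
for rank, s in enumerate(ranked, start=1):
    print(f"#{rank}: {s['name']} ({s['rmax_pflops']} PFlop/s)")
```

Because a single benchmark score is the sole ranking key, the list is reproducible and easy to compare across releases, which is precisely what makes it useful for tracking long-term trends.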

Tianhe-2, a supercomputer developed by China’s National University of Defense Technology, has retained its position as the world’s No. 1 system, according to the June 2015 edition of the TOP500 list.

A Look Back at Top #1 Systems on the TOP500 List

July 23, 2015 11:23 am | by Suzanne Tracy, Editor-in-Chief, Scientific Computing and HPC Source | Comments

The TOP500 list provides international rankings of general-purpose HPC systems that are in common use for high-end applications. Twice a year, in June and November, a new list featuring the sites operating the 500 most powerful computer systems is assembled and released. The project was started in 1993 to provide a reliable basis for tracking and detecting trends in high-performance computing.

Optimization of workflows in a modern HPC environment is a complex task that requires significant software support.

Optimizing Workflows in Globally Distributed, Heterogeneous HPC Computing Environments

July 8, 2015 1:56 pm | by Rob Farber | Comments

Optimization of workflows in a modern HPC environment is now a globally distributed, heterogeneous-hardware-challenged task for users and systems administrators. Not only is this a mouthful to say, it is also a complex task that requires significant software support.

Steve Conway is Research VP, HPC at IDC.

Thoughts on the Exascale Race: HPC has become a mature market

July 8, 2015 12:57 pm | by Steve Conway | Comments

As the HPC community hurtles toward the exascale era, it’s good to pause and reflect. Here are a few thoughts… The DOE CORAL procurement signaled that extreme-performance supercomputers from the U.S., Japan, China and Europe should reach the 100-300PF range in 2017-2018. That’s well short of DOE’s erstwhile stretch goal of deploying a trim, energy-efficient peak exaflop system in 2018 or so, but still impressive. It would appear...

Combining easy-to-use statistics with interactive graphics

Software Review: Partek Genomics Suite 6.6

July 7, 2015 3:58 pm | by John A. Wass, Ph.D. | Comments

Your corresponding editor really loves to review these genomics programs, as genomics (the study of the entire gene complement in an organism) is his area of research, and an exciting one at that. It is now at the center of a cutting-edge movement within the area of personalized medicine. The software for doing this is highly advanced in that its functioning mates the precision of mathematics/statistics with the variability of biology...

A 3-D model of the human brain, which considers cortical architecture, connectivity, genetics and function. Courtesy of Research Centre Juelich

Advanced Computation Plays Key Role in Accelerating Life Sciences Research

July 7, 2015 12:11 pm | by Thomas Lippert, Ph.D., and Manuel Peitsch, Ph.D. | Comments

Life scientists are increasingly reliant on advanced computation to advance their research. Two very prominent examples of this trend will be presented this summer at the ISC High Performance conference, which will feature a five-day technical program focusing on HPC technologies and their application in scientific fields, as well as their adoption in commercial environments.

R.D. McDowall is Director, R D McDowall Limited.

Review and Critique of the MHRA Data Integrity Guidance for Industry

July 7, 2015 9:34 am | by R.D. McDowall, Ph.D. | Comments

This new series of four articles takes a look at the UK’s Medicines and Healthcare products Regulatory Agency (MHRA) guidance for industry on data integrity. The focus of these articles is an interpretation and critique of the second version of the MHRA data integrity guidance for laboratories working to European Union GMP regulations, such as analytical development in R&D and quality control in pharmaceutical manufacturing.

A plug-and-play standardized protocol will simplify processes.

Linking an Instrument to a Tablet: Still a bridge too far?

July 6, 2015 10:26 am | by Peter J. Boogaard | Comments

Over 75 percent of laboratory experiments or analyses start with some kind of manual process, such as weighing. The majority of the results of these measurements are still written down manually on a piece of paper or re-typed into a computer or tablet. ELNs and mobile devices like tablets are married to each other. However, to connect a balance, you need to be an IT professional...


