Articles
Illustration of a cellulosomal structure. Cellulosomes are highly efficient molecular machines that can degrade plant fibers. Red is the scaffoldin of the cellulosome, where most of the cohesins are, and blue are the enzymatic domains, where most of the dockerins are.

Cellulosomes: One of Life’s Strongest Biomolecular Bonds Discovered with the Use of Supercomputers

July 28, 2015 3:40 pm | by Linda Barney

Researchers have discovered one of nature’s strongest mechanical bonds in a protein network called the cellulosome. Cellulosomes are produced by bacteria and contain enzymes that can effectively dismantle cellulose and chemically break it down. The discovery was aided by the use of supercomputers to simulate the interactions at the atomic level.
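The researchers’ actual simulation code is not part of this summary; as a rough, hypothetical sketch of the steered-molecular-dynamics idea behind such bond-strength measurements, the toy model below pulls a bead out of a "bond" potential with a moving spring and records the peak pulling force. Every parameter here is invented for illustration.

```python
import numpy as np

# Toy 1D steered-MD sketch (all units and parameters are invented):
# the "bond" is a Morse potential well, and a virtual spring attached
# to the bead is pulled away at constant velocity. The peak spring
# force before the bead escapes approximates the rupture force that
# real atomic-level simulations measure.
rng = np.random.default_rng(0)

D, a = 10.0, 2.0                 # Morse well depth and width
k_spring, v_pull = 5.0, 0.02     # pulling-spring stiffness and speed
dt, gamma, kT = 1e-3, 1.0, 0.1   # timestep, friction, temperature

def bond_force(x):
    """Restoring force of the Morse 'bond' at extension x."""
    e = np.exp(-a * x)
    return -2.0 * D * a * e * (1.0 - e)

x, max_force = 0.0, 0.0          # bead starts at the potential minimum
for step in range(200_000):
    anchor = v_pull * step * dt              # spring anchor moves outward
    f_spring = k_spring * (anchor - x)       # spring drags the bead
    noise = np.sqrt(2 * kT * gamma / dt) * rng.normal()
    # Overdamped Langevin update: deterministic drift plus thermal kick
    x += dt / gamma * (bond_force(x) + f_spring + noise)
    max_force = max(max_force, f_spring)
    if x > 3.0:                              # bead left the well: rupture
        break

print(f"rupture at step {step}, peak pulling force ~ {max_force:.2f}")
```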

The rate of growth in computing power predicted by Gordon Moore (pictured) could be slowing. Courtesy of Steve Jurvetson, CC BY

Moore’s Law is 50 Years Old, but will it Continue?

July 27, 2015 9:06 am | by Jonathan Borwein and David H. Bailey

It’s been 50 years since Gordon Moore, one of the founders of the microprocessor company Intel, gave us Moore’s Law, which says that the complexity of computer chips ought to double roughly every two years. Now Intel’s current CEO, Brian Krzanich, is saying the days of Moore’s Law may be coming to an end, as the time between new innovations appears to be widening.
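The law itself is simple arithmetic: doubling every two years means growth by a factor of 2^(t/2) after t years. A quick sketch of what 50 years of that compounding implies:

```python
# Moore's Law as arithmetic: N(t) = N0 * 2**(t / doubling_period).
def moores_law(n0, years, doubling_period=2.0):
    return n0 * 2 ** (years / doubling_period)

# Intel's 4004 (1971) had roughly 2,300 transistors; 50 years of
# doubling every two years predicts on the order of:
print(f"{moores_law(2_300, 50):,.0f} transistors")  # ~77 billion
```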

The TOP500 project was started in 1993 to provide a reliable basis for tracking and detecting trends in high-performance computing.

TOP500 Answers the Most Frequently Asked Questions about the Project and the List

July 23, 2015 3:26 pm | by TOP500

The TOP500 project was started in 1993 to provide a reliable basis for tracking and detecting trends in high-performance computing. Twice a year, a list of the sites operating the 500 most powerful computer systems is assembled and released. The best performance on the Linpack benchmark is used as a performance measure for ranking the computer systems.
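The ranking rule reduces to a single sort key. A minimal sketch, using approximate June 2015 Rmax figures in petaflops for illustration only:

```python
# The TOP500 ranking rule in miniature: order systems by their best
# measured Linpack performance (Rmax). Figures are approximate June
# 2015 values in petaflops, shown for illustration only.
systems = [
    ("Titan", 17.59),
    ("Tianhe-2", 33.86),
    ("Sequoia", 17.17),
]

for rank, (name, rmax) in enumerate(
        sorted(systems, key=lambda s: s[1], reverse=True), start=1):
    print(f"#{rank}: {name} ({rmax} PFlop/s Rmax)")
```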

Tianhe-2, a supercomputer developed by China’s National University of Defense Technology, has retained its position as the world’s No. 1 system, according to the June 2015 edition of the list.

A Look Back at #1 Systems on the TOP500 List

July 23, 2015 11:23 am | by Suzanne Tracy, Editor-in-Chief, Scientific Computing and HPC Source

The TOP500 list provides international rankings of general-purpose HPC systems that are in common use for high-end applications. Twice a year, in June and November, a new list featuring the sites operating the 500 most powerful computer systems is assembled and released. The project was started in 1993 to provide a reliable basis for tracking and detecting trends in high-performance computing.

Optimization of workflows in a modern HPC environment is a complex task that requires significant software support.

Optimizing Workflows in Globally Distributed, Heterogeneous HPC Computing Environments

July 8, 2015 1:56 pm | by Rob Farber

Optimization of workflows in a modern HPC environment is now a globally distributed, heterogeneous-hardware-challenged task for users and systems administrators. Not only is this a mouthful to say, it is also a complex task that requires significant software support.
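The article surveys that software support rather than prescribing an algorithm; as one hypothetical illustration of the core decision such tools automate, the greedy sketch below assigns each task to whichever resource, local or remote, can finish it earliest. Resource names, speeds and transfer costs are all invented.

```python
# Hypothetical greedy placement for a heterogeneous, distributed
# environment: send each task to whichever resource finishes it
# earliest, given relative speeds, transfer latency to remote sites,
# and each resource's current backlog. Real workflow systems weigh
# far more factors; this shows only the core trade-off.
speed   = {"local_cpu": 1.0, "local_gpu": 8.0, "remote_gpu": 8.0}
latency = {"local_cpu": 0.0, "local_gpu": 0.0, "remote_gpu": 30.0}
busy_until = {r: 0.0 for r in speed}

tasks = [("fft", 100.0), ("md_step", 400.0), ("reduce", 50.0)]  # (name, work)

for name, work in tasks:
    def finish(r):
        # finish time = wait for the resource + data transfer + compute
        return busy_until[r] + latency[r] + work / speed[r]
    best = min(speed, key=finish)
    busy_until[best] = finish(best)
    print(f"{name} -> {best} (done at t={busy_until[best]:.1f})")
```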

Steve Conway is Research VP, HPC at IDC.

Thoughts on the Exascale Race: HPC has become a mature market

July 8, 2015 12:57 pm | by Steve Conway

As the HPC community hurtles toward the exascale era, it’s good to pause and reflect. Here are a few thoughts… The DOE CORAL procurement signaled that extreme-performance supercomputers from the U.S., Japan, China and Europe should reach the 100-300PF range in 2017-2018. That’s well short of DOE’s erstwhile stretch goal of deploying a trim, energy-efficient peak exaflop system in 2018 or so, but still impressive. It would appear...

Combining easy-to-use statistics with interactive graphics

Software Review: Partek Genomics Suite 6.6

July 7, 2015 3:58 pm | by John A. Wass, Ph.D.

Your corresponding editor really loves to review these genomics programs, as genomics (the study of the entire gene complement in an organism) is his area of research, and an exciting one at that. It is now at the center of a cutting-edge movement within the area of personalized medicine. The software for doing this is highly advanced in that its functioning mates the precision of mathematics/statistics with the variability of biology...
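Packages like this wrap the statistics in menus and plots, but the core of many gene-expression comparisons is something as plain as a per-gene two-sample t-test. A toy sketch with simulated data (not Partek’s code):

```python
import numpy as np
from scipy import stats

# Toy differential-expression test of the kind genomics suites automate
# behind interactive graphics: a two-sample t-test per gene between
# control and treated samples. All data here are simulated.
rng = np.random.default_rng(1)
control = rng.normal(10.0, 1.0, size=(100, 8))  # 100 genes x 8 samples
treated = rng.normal(10.0, 1.0, size=(100, 8))
treated[:10] += 3.0                             # 10 genes truly shifted

t, p = stats.ttest_ind(treated, control, axis=1)
hits = np.flatnonzero(p < 0.05 / len(p))        # Bonferroni correction
print(f"{hits.size} genes flagged:", hits)
```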

A 3-D model of the human brain, which considers cortical architecture, connectivity, genetics and function. Courtesy of Research Centre Juelich

Advanced Computation Plays Key Role in Accelerating Life Sciences Research

July 7, 2015 12:11 pm | by Thomas Lippert, Ph.D., and Manuel Peitsch, Ph.D.

Life scientists are increasingly reliant on advanced computation to advance their research. Two very prominent examples of this trend will be presented this summer at the ISC High Performance conference, which will feature a five-day technical program focusing on HPC technologies and their application in scientific fields, as well as their adoption in commercial environments.

R.D. McDowall is Director, R D McDowall Limited.

Review and Critique of the MHRA Data Integrity Guidance for Industry

July 7, 2015 9:34 am | by R.D. McDowall, Ph.D.

This new series of four articles takes a look at the UK’s Medicines and Healthcare products Regulatory Agency (MHRA) guidance for industry on data integrity. The focus of these articles is an interpretation and critique of the second version of the MHRA data integrity guidance for laboratories working to European Union GMP regulations, such as analytical development in R&D and quality control in pharmaceutical manufacturing.
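The guidance is regulatory prose rather than code, but one common technical control behind data integrity is a tamper-evident audit trail. The hypothetical sketch below (not taken from the MHRA guidance) chains each record to the hash of the previous one, so a retrospective edit breaks every hash that follows:

```python
import hashlib, json, time

# Hypothetical tamper-evident audit trail: each entry stores the hash
# of the previous entry, so silently altering a past record breaks the
# chain. This illustrates one control behind "data integrity"; it is
# not taken from the MHRA guidance itself.
def append(trail, user, action):
    prev = trail[-1]["hash"] if trail else "genesis"
    entry = {"ts": time.time(), "user": user, "action": action, "prev": prev}
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()).hexdigest()
    trail.append(entry)

def verify(trail):
    for i, e in enumerate(trail):
        expect = trail[i - 1]["hash"] if i else "genesis"
        body = {k: v for k, v in e.items() if k != "hash"}
        recomputed = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if e["prev"] != expect or e["hash"] != recomputed:
            return False
    return True

trail = []
append(trail, "analyst1", "sample weighed: 10.02 mg")
append(trail, "analyst1", "result approved")
print(verify(trail))                             # True
trail[0]["action"] = "sample weighed: 9.80 mg"   # retrospective edit...
print(verify(trail))                             # False: chain broken
```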

A plug-and-play standardized protocol will simplify processes.

Linking an Instrument to a Tablet: Still a bridge too far?

July 6, 2015 10:26 am | by Peter J. Boogaard

Over 75 percent of laboratory experiments or analyses start with some kind of manual process, such as weighing. The majority of the results of these measurements are still written down manually on a piece of paper or re-typed into a computer or tablet. ELNs and mobile devices like tablets are a natural match for each other. However, to connect a balance, you need to be an IT professional...
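To illustrate how small the missing bridge really is, here is a hypothetical sketch that reads one weight from a serial-attached balance with pyserial and posts it to an ELN REST endpoint; the port, the balance’s line format and the URL are all invented, since real balances and ELN APIs each speak their own protocols.

```python
import serial    # pyserial: pip install pyserial
import requests

# Hypothetical instrument-to-ELN bridge. The serial port, the balance's
# line format, and the ELN endpoint below are all invented for
# illustration; real balances (and real ELN APIs) differ.
ELN_URL = "https://eln.example.com/api/measurements"  # hypothetical

with serial.Serial("/dev/ttyUSB0", baudrate=9600, timeout=5) as port:
    line = port.readline().decode("ascii").strip()    # e.g. "ST,+0010.02,g"
    _, value, unit = line.split(",")

payload = {"instrument": "balance-01", "value": float(value), "unit": unit}
resp = requests.post(ELN_URL, json=payload, timeout=10)
resp.raise_for_status()
print("logged:", payload)
```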

William Weaver is an associate professor in the Department of Integrated Science, Business, and Technology at La Salle University.

If These Walls Could Talk: The Internet of Things will save lives, time and treasure

July 3, 2015 9:02 am | by William Weaver, Ph.D.

On April 15, 2014, a Wilmington, DE, business owner was driving home on the I-495 bypass, as he had done for 25 years. While traveling on the twin three-lane bridges that span the Christiana River, he noticed that the normally parallel bridges were offset by nearly 18 inches in height, and that the apparent listing of one of the lanes had created a large gap between the bridges through which the ground was visible, some four stories below.
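The column’s premise, that networked sensors could have raised the alarm long before a passing driver did, comes down to a threshold alert. A hypothetical sketch (sensor names, readings and limits are all invented):

```python
# Hypothetical structural-health alert of the kind the column argues
# for: deflection sensors on each bridge span report readings, and any
# offset beyond a limit raises an alarm. The sensor IDs, readings and
# the 2-inch alert limit are invented for illustration.
ALERT_LIMIT_IN = 2.0   # alert if spans drift more than 2 inches apart

readings = {"span_north": 0.3, "span_south": 18.2}  # inches of deflection

offset = abs(readings["span_north"] - readings["span_south"])
if offset > ALERT_LIMIT_IN:
    print(f"ALERT: spans offset by {offset:.1f} in "
          f"(limit {ALERT_LIMIT_IN} in) - close bridge for inspection")
```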

The Ubimet Weather Cockpit allows golf courses, race venues and other clients to access site-specific weather information unique to their topography.

Weather Matters: Enabling Precise, Real-time Forecasts

June 11, 2015 3:47 pm | by Ken Strandberg

Many of the world’s industries are affected by weather. UBIMET is one of the world’s leading private weather service providers. The company offers a range of precise, real-time micro-climate forecasting and alerts, historical weather data, and other services to several million customers around the globe. UBIMET’s competitive advantage lies in the sophistication of its solutions and the depth of its science and technology.

R.D. McDowall is Principal, McDowall Consulting.

Review and Critique of the MHRA Data Integrity Guidance for Industry — Part 4: System Design, Definitions and Overall Assessment

May 29, 2015 1:56 pm | by R.D. McDowall, Ph.D.

This is the final part of a series reviewing and critiquing recent MHRA guidance for industry on data integrity. The first part of the series provided a background to the guidance document and discussed the introduction. The second part reviewed the data governance system, and the third part discussed data criticality and lifecycle. This part reviews the system design, some of the definitions, and finishes with an overall assessment.

R.D. McDowall is Principal, McDowall Consulting.

Review and Critique of the MHRA Data Integrity Guidance for Industry — Part 3: Data Criticality and Data Life Cycle

May 29, 2015 12:21 pm | by R.D. McDowall, Ph.D.

This is the third of a four-part series reviewing and critiquing the recent Medicines and Healthcare products Regulatory Agency guidance for industry document on data integrity. The first part of the series provided a background to the guidance document and discussed the introduction to the document. The second part reviewed and discussed the data governance system. In this part, we will look at data criticality and the data life cycle.

R.D. McDowall is Principal, McDowall Consulting.

Review and Critique of the MHRA Data Integrity Guidance for Industry — Part 2: Data Governance System

May 29, 2015 11:27 am | by R.D. McDowall, Ph.D.

This is the second of a four-part series reviewing and critiquing the recent Medicines and Healthcare products Regulatory Agency (MHRA) guidance for industry document on data integrity. The first part of the series provided a background to the guidance document and discussed the introduction to the document. In this part, we will look at the MHRA requirement for a data governance system.
