Articles
Artist’s impression of a proton depicting three interacting valence quarks inside. Courtesy of Jefferson Lab

HPC Community Experts Weigh in on Code Modernization

December 17, 2014 4:33 pm | by Doug Black

Sense of urgency and economic impact emphasized: The “hardware first” ethic is changing. Hardware retains the glamour, but there is now a stark recognition that the newest parallel supercomputers will not realize their full potential without reengineering application software to divide computational problems efficiently among the thousands of cores that comprise next-generation many-core computing platforms.
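As a toy illustration of that division of labor (my own sketch in Python, not drawn from any of the codes the article discusses), the basic pattern is to split a problem into independent chunks, hand one to each core, and combine the partial results:

    import multiprocessing as mp

    def partial_sum(chunk):
        # Each worker handles its slice of the problem independently.
        return sum(x * x for x in chunk)

    if __name__ == "__main__":
        data = list(range(1_000_000))
        n = mp.cpu_count()
        chunks = [data[i::n] for i in range(n)]  # one chunk per core
        with mp.Pool(n) as pool:
            total = sum(pool.map(partial_sum, chunks))  # combine partial results
        print(total)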

Rob Farber is an independent HPC expert who works with startups and Fortune 100 companies, as well as government and academic organizations.

Today’s Enterprising GPUs

November 20, 2014 2:09 pm | by Rob Farber

HPC has always embraced the leading edge of technology and, as such, acts as the trailblazer and scout for enterprise and business customers. HPC has proven and matured the capabilities of previously risky devices, like GPUs, that enterprise customers now leverage to create competitive advantage. GPUs have moved beyond “devices with potential” to “production devices” that are used for profit generation.
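As a minimal sketch of what production GPU use can look like from application code (assuming the CuPy library and a CUDA-capable GPU; the arrays and arithmetic here are purely illustrative):

    import cupy as cp  # assumes CuPy and a CUDA-capable GPU are installed

    # Allocate data directly on the GPU and compute there.
    a = cp.random.rand(10_000_000)
    b = cp.random.rand(10_000_000)
    c = a * b + 2.0          # element-wise kernel runs on the GPU
    result = float(c.sum())  # reduction on the GPU; scalar copied back to host
    print(result)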

John Wass is a statistician based in Chicago, IL.

Exploration and Analysis of DNA Microarray and Other High-Dimensional Data

November 18, 2014 3:10 pm | by John A. Wass, Ph.D.

The introduction of newer sequencing methodologies, DNA microarrays and high-throughput technology has resulted in a deluge of large data sets that require new strategies to clean, normalize and analyze the data. These strategies and more are covered in approximately 300 pages with extraordinary clarity and minimal mathematics.
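One such clean-up strategy is quantile normalization, which forces every sample’s expression values onto a common reference distribution; here is a minimal numpy sketch (my own illustration, not code from the book under review):

    import numpy as np

    def quantile_normalize(expr):
        # expr is a genes x samples matrix; map each sample (column)
        # onto the same reference distribution.
        order = np.argsort(expr, axis=0)   # rank of each gene within a sample
        ranked = np.sort(expr, axis=0)
        reference = ranked.mean(axis=1)    # mean across samples at each rank
        out = np.empty_like(expr, dtype=float)
        for j in range(expr.shape[1]):
            out[order[:, j], j] = reference
        return out

    raw = np.array([[5.0, 4.0, 3.0],
                    [2.0, 1.0, 4.0],
                    [3.0, 4.0, 6.0],
                    [4.0, 2.0, 8.0]])
    print(quantile_normalize(np.log2(raw)))  # log2 first, as is conventional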

Karol Kowalski, Capability Lead for NWChem Development, works in the Environmental Molecular Sciences Laboratory (EMSL) at PNNL.

Advancing Computational Chemistry with NWChem

November 18, 2014 3:07 pm | by Mike Bernhardt, HPC Community Evangelist, Intel

An interview with PNNL’s Karol Kowalski, Capability Lead for NWChem Development. NWChem is an open-source, high-performance computational chemistry tool developed for the Department of Energy at Pacific Northwest National Laboratory in Richland, WA. I recently visited with Kowalski, who works in the Environmental Molecular Sciences Laboratory (EMSL) at PNNL.

Steve Conway is Research VP, HPC at IDC.

Small and Medium Enterprises Enter the Limelight

November 14, 2014 11:43 am | by Steve Conway

A decade of close scrutiny has shed much more light on the technical computing needs of small and medium enterprises (SMEs), but they are still shrouded in partial darkness. That’s hardly surprising for a diverse global group with millions of members ranging from automotive suppliers and shotgun genomics labs to corner newsstands and strip mall nail salons.

John Kirkley, President of Kirkley Communications, is a writer and editor who specializes in HPC.

A New Dawn: Bringing HPC to Smaller Manufacturers

November 13, 2014 11:26 am | by John Kirkley

Folk wisdom can sometimes be right on target. For example, there’s that old bromide about leading a horse to water. In this case, the water is high performance computing, and the reluctant equine is the huge base of small- to medium-sized manufacturers (SMMs) in the U.S. According to the National Center for Manufacturing Sciences, there are approximately 300,000 manufacturers in the U.S., and over 95 percent of them can be characterized as SMMs.

Specific bits of a digital image file that have been replaced with the bits of a secret steganographic payload permit a covert agent to post top-secret documents on their Facebook wall by simply uploading what appear to be cute images of kittens.

Leading the Eyewitness: Digital Image Forensics in a Megapixel World

November 12, 2014 3:42 pm | by William Weaver, Ph.D.

Current research in digital image forensics is developing better ways to convert image files into the frequency domain, such as wavelet transforms in addition to the more traditional cosine transforms, as well as more sensitive methods for determining whether each area of an image belongs to the whole.
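As a rough illustration of the wavelet side of that toolbox, here is a hand-rolled one-level 2-D Haar transform in numpy (real forensic pipelines use more sophisticated wavelets and statistics):

    import numpy as np

    def haar2d(img):
        # One level of a 2-D Haar wavelet transform: returns the coarse
        # approximation plus horizontal, vertical and diagonal detail bands.
        # Detail statistics that differ from the rest of the image can flag
        # regions that were pasted in from elsewhere.
        img = img[:img.shape[0] // 2 * 2, :img.shape[1] // 2 * 2].astype(float)
        a = img[0::2, 0::2]
        b = img[0::2, 1::2]
        c = img[1::2, 0::2]
        d = img[1::2, 1::2]
        ll = (a + b + c + d) / 4.0   # approximation
        lh = (a + b - c - d) / 4.0   # horizontal detail
        hl = (a - b + c - d) / 4.0   # vertical detail
        hh = (a - b - c + d) / 4.0   # diagonal detail
        return ll, lh, hl, hh

    img = np.random.rand(8, 8)       # stand-in for a grayscale image
    ll, lh, hl, hh = haar2d(img)
    print(hh.var())                  # per-band energy is one forensic signal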


Women Who Compute: Overcoming Lack of Gender Diversity in Science and Technology

November 11, 2014 3:17 pm | by Rob Farber

Recent gender diversity reports from Google, Facebook and Apple (to name a few) have spurred a number of positive efforts to bring more women into computer science, including the SC14 Women in High Performance Computing workshop, NVIDIA’s Women who CUDA campaign, and Google’s $50M Women Who Code program.

The Renaissance Computing Institute’s high performance computing cluster quickly generates better intelligence about coastal hazards and risk. Courtesy of RENCI

HPC Matters to our Quality of Life and Prosperity

November 11, 2014 2:22 pm | by Don Johnston

The complexity of high-end computing technology makes it largely invisible to the public. HPC simply lacks the Sputnik sex appeal of the space race, to which current global competition in supercomputing is often compared. Rather, it is seen as the exclusive realm of academia and national labs. Yet, its impact reaches into almost every aspect of daily life. Organizers of SC14 had this reach in mind when selecting the “HPC Matters” theme.

The Intel Parallel Computing Center (IPCC) at Lawrence Berkeley National Laboratory is performing code modernization work on NWChem.

A Focus on Code Modernization: Observing Year One of the Intel Parallel Computing Centers

November 10, 2014 11:11 am | by Doug Black

One year ago, recognizing a rapidly emerging challenge facing the HPC community, Intel launched the Parallel Computing Centers program. With the great majority of the world’s technical HPC computing challenges being handled by systems based on Intel architecture, the company was keenly aware of the growing need to modernize a large portfolio of public domain scientific applications, to prepare these critically important codes for multi-core and many-core architectures.
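Much of that modernization boils down to restructuring serial loops so they can exploit wide vector units and many cores. A toy Python/numpy contrast of the before-and-after (the actual IPCC work targets production Fortran and C codes, not Python):

    import numpy as np
    import time

    x = np.random.rand(2_000_000)

    # Legacy style: a scalar loop, one element at a time
    t0 = time.perf_counter()
    acc = 0.0
    for v in x:
        acc += 3.0 * v * v + 2.0 * v
    t1 = time.perf_counter()

    # Modernized: one array expression mapped onto vectorized kernels
    t2 = time.perf_counter()
    acc_vec = float(np.sum(3.0 * x * x + 2.0 * x))
    t3 = time.perf_counter()

    print(f"loop {t1 - t0:.2f}s, vectorized {t3 - t2:.2f}s, "
          f"results agree: {np.isclose(acc, acc_vec)}")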

JMP 11: Remarkable Statistics, Graphics and Integration Designed for the Technician, Scientist, Engineer and Businessperson

JMP 11: Remarkable Statistics, Graphics and Integration

November 7, 2014 10:30 am | by John A. Wass, Ph.D.

It should come as no surprise to readers of this column that JMP is a personal favorite and, along with SAS, one of my most-used programs. There are a number of reasons for this. Of the many advantages a package can offer, the ones uppermost in my mind are breadth and depth of the statistics offered, quality of the diagnostics, interconnectivity of graphics with both data and analyses, and ease of use.

Michael Elliott is CEO of Atrium Research & Consulting.

Déjà Vu All Over Again: Knowledge management is not an IT problem, but a challenge to the culture of an organization

November 7, 2014 8:48 am | by Michael H. Elliott

In the late 1990s and the early 2000s, “Knowledge Management” (KM) was all the rage. Companies invested millions in enterprise content management (ECM) systems and teams of KM practitioners. It was believed that the codification of all knowledge assets across the enterprise would lead to new insights and higher levels of innovation.


A Q&A with Paul Messina, Director of Science for the Argonne Leadership Computing Facility

November 6, 2014 4:22 pm | by Brian Grabowski, Argonne National Laboratory

Highly motivated to organize the Argonne Training Program on Extreme-Scale Computing, Paul Messina reflects on what makes the program unique and a can’t-miss opportunity for the next generation of HPC scientists. ATPESC is an intense, two-week program that covers most of the topics and skills necessary to conduct computational science and engineering research on today’s and tomorrow’s high-end computers.

R.D. McDowall is Principal, McDowall Consulting.

The Cloud Meets GMP Regulations – Part 4: Selecting a Cloud Service Provider

November 6, 2014 3:16 pm | by R.D. McDowall

The purpose of this series is to discuss the impact of GMP (Good Manufacturing Practice) regulations on cloud computing and to debate some of the regulatory issues facing an organization contemplating this approach. In this part, we look at a process to select a suitable hosting provider that can demonstrate compliance with GMP and possession of qualified IT infrastructure.

R.D. McDowall is Principal, McDowall Consulting.

The Cloud Meets GMP Regulations – Part 3: Options for Auditing a Cloud Service Provider

November 3, 2014 2:53 pm | by R.D. McDowall

The purpose of this series is to discuss the impact of GMP (Good Manufacturing Practice) regulations on cloud computing and to debate some of the regulatory issues facing an organization contemplating this approach. In this part, we look at the options for auditing a cloud service provider.

