
Helping to Save Lives of Critically Ill Children

February 12, 2015 10:17 am | by Suzanne Tracy, Editor-in-Chief, Scientific Computing and HPC Source

For those on the front lines of treating cancer, speed and precision are key to patients’ survival. Pediatric cancer researchers have been making incredible strides in accelerating delivery of new diagnostic and treatment options. Supercomputer-powered genetic diagnosis is being used to harness the power of high-throughput genomic and proteomic methods and is playing a key role in improving outcomes for children with genetic diseases.

John Wass is a statistician based in Chicago, IL.

Explorations of Mathematical Models in Biology with Maple

February 10, 2015 9:29 am | by John A. Wass, Ph.D.

The author of this wonderful text delivers a brief, easy-to-absorb, yet very comprehensive treatment of modeling real-world data with Maple. Maple is software for performing mathematics, with a none-too-steep learning curve. In the introduction, the author is quick to point out that this is a detailed textbook of neither mathematical modeling nor Maple. It is, however, a very well-written manual of introductory modeling and of Maple’s use.
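
To give a flavor of the modeling the book teaches, consider the classic logistic growth equation. This is a hedged illustration only: the book works in Maple, while the sketch below uses Python with SciPy, and the rate and carrying-capacity values are assumptions of mine, not the book’s.

```python
# Toy illustration (not from the book): the classic logistic growth
# model dy/dt = r*y*(1 - y/K), typical of the biological models the
# text develops in Maple, sketched here with SciPy instead.
import numpy as np
from scipy.integrate import solve_ivp

r, K = 0.5, 100.0                      # assumed growth rate and carrying capacity

def logistic(t, y):
    return r * y * (1.0 - y / K)

sol = solve_ivp(logistic, (0.0, 30.0), [2.0], dense_output=True)
t = np.linspace(0.0, 30.0, 7)
print(np.round(sol.sol(t)[0], 1))      # population rising toward K
```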

Rob Farber is an independent HPC expert to startups and Fortune 100 companies, as well as government and academic organizations.

Using Profile Information for Optimization, Energy Savings and Procurements

February 9, 2015 12:11 pm | by Rob Farber

Optimization for high performance and energy efficiency is a necessary next step after verifying that an application works correctly. In the HPC world, profiling means collecting data from hundreds to potentially many thousands of compute nodes over the length of a run. In other words, profiling is a big-data task, but one where the rewards can be significant — including potentially saving megawatts of power or reducing the time to solution.
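
At single-node scale, the collection step resembles an ordinary profiler run; the article’s point is that HPC repeats such measurements across thousands of nodes and must then mine the aggregate. A minimal single-process sketch using Python’s built-in cProfile (illustrative only, not any specific HPC profiling tool):

```python
# Minimal single-process illustration of profile collection with the
# standard-library cProfile; HPC-scale profiling repeats this kind of
# measurement on every node and mines the aggregated results.
import cProfile
import io
import pstats

def hotspot(n=200_000):
    return sum(i * i for i in range(n))

profiler = cProfile.Profile()
profiler.enable()
hotspot()
profiler.disable()

stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(5)
print(stream.getvalue())               # top five calls by cumulative time
```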

The DSCOVR Mission's NISTAR — the NIST Advanced Radiometer — will measure the Earth’s radiation budget, or whether our planet’s atmosphere is retaining more or less solar energy than it radiates back to space. Courtesy of NASA/DSCOVR

A Measurement Job That’s Truly Out of this World

February 9, 2015 10:23 am | by Chad Boutin, NIST

Quick, think of a four-letter name beginning with “N” for a federal agency involved in space science. Though NASA or NOAA would rightfully pop into mind first, crossword puzzle aficionados should know that NIST would be a correct answer as well — because the National Institute of Standards and Technology has been an integral part of readying technology for blastoff for decades.


Book Review: Applied Bayesian Modelling, 2nd Edition

January 13, 2015 8:59 am | by John A. Wass, Ph.D.

This is not a text for the novice. However, for math/statistics aficionados, there is much to be had. The book’s great strength lies in two areas: the first is Peter Congdon’s generation of an excellent bibliography of the most modern techniques available, and the other is his (slightly) more straightforward explanations of the strengths and weaknesses of these techniques and suggestions for optimizing the results.
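
For readers wanting a foothold before tackling the book, the simplest Bayesian calculation is a conjugate update, which avoids the MCMC machinery Congdon covers. A toy sketch (my illustration, not code from the book; the prior and data values are assumed):

```python
# Toy conjugate update (illustrative, not from the book): a Beta(a, b)
# prior on a success probability combined with binomial data yields a
# Beta(a + successes, b + failures) posterior in closed form.
from scipy import stats

a, b = 2.0, 2.0                        # assumed prior pseudo-counts
successes, failures = 30, 10           # assumed observed data

posterior = stats.beta(a + successes, b + failures)
print(f"posterior mean: {posterior.mean():.3f}")
lo, hi = posterior.interval(0.95)
print(f"95% credible interval: ({lo:.3f}, {hi:.3f})")
```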


Qlucore Omics Explorer: Analysis in an Instant

January 8, 2015 4:00 pm | by John A. Wass, Ph.D.

Qlucore is a software platform for the analysis of genomics, proteomics and related data. As with most statistical and genomics software, it generates an immediate graphic for most analyses. Its specific areas of use include gene expression, protein arrays, DNA methylation, miRNA, proteomics, and pattern and structure identification in multivariate data.

Peter Boogaard is the founder of Industrial Lab Automation and chairman of the Paperless Lab Academy.

2015 Promises Major Milestones and Demands for Change

January 7, 2015 12:28 pm | by Peter J. Boogaard

Despite the fact that industries won’t change working processes unless there is a mandatory need to do so, major milestones are expected in 2015 in the battle to adopt data standards in our scientific community. The need for deployment of these integration standards to enable efficient sharing of knowledge across our internal and external partners is reinforced by regulatory bodies.

Artist’s impression of a proton depicting three interacting valence quarks inside. Courtesy of Jefferson Lab

HPC Community Experts Weigh in on Code Modernization

December 17, 2014 4:33 pm | by Doug Black

Sense of urgency and economic impact emphasized: The “hardware first” ethic is changing. Hardware retains the glamour, but there is now the stark realization that the newest parallel supercomputers will not realize their full potential without reengineering the software to efficiently divide computational problems among the thousands of processors that make up next-generation many-core computing platforms.
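
The principle behind that reengineering, in miniature: restructure serial work into independent chunks so that many processors can run concurrently. A toy sketch (my illustration, using local processes rather than a many-core supercomputer):

```python
# Toy illustration of dividing a computation into independent chunks so
# many processors can work concurrently; real code modernization does
# this across thousands of many-core nodes, not eight local processes.
from multiprocessing import Pool

def partial_sum(bounds):
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

if __name__ == "__main__":
    n, chunks = 10_000_000, 8
    step = n // chunks
    work = [(k * step, (k + 1) * step) for k in range(chunks)]
    with Pool(processes=chunks) as pool:
        total = sum(pool.map(partial_sum, work))
    print(total)                       # equals sum(i*i for i in range(n))
```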


Today’s Enterprising GPUs

November 20, 2014 2:09 pm | by Rob Farber

HPC has always embraced the leading edge of technology and, as such, acts as the trailblazer and scout for enterprise and business customers. HPC has highlighted and matured the capabilities of previously risky devices, like GPUs, that enterprise customers now leverage to create competitive advantage. GPUs have moved beyond “devices with potential” to “production devices” that are used for profit generation.


Exploration and Analysis of DNA Microarray and Other High-Dimensional Data

November 18, 2014 3:10 pm | by John A. Wass, Ph.D.

The introduction of newer sequencing methodologies, DNA microarrays and high-throughput technology has resulted in a deluge of large data sets that require new strategies to clean, normalize and analyze the data. All of these and more are covered in approximately 300 pages with extraordinary clarity and minimal mathematics.
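
One normalization strategy typically covered in such texts is quantile normalization, which forces every array’s expression values onto a shared distribution. A compact NumPy sketch (my illustration, not code from the book; it assumes a probes-by-arrays matrix and ignores tie handling):

```python
# Hypothetical NumPy sketch of quantile normalization: rank the values
# within each array (column), then replace every rank with the mean of
# the values holding that rank across all arrays, so the columns share
# one distribution.
import numpy as np

def quantile_normalize(x):
    ranks = np.argsort(np.argsort(x, axis=0), axis=0)   # per-column ranks
    reference = np.sort(x, axis=0).mean(axis=1)         # mean k-th smallest value
    return reference[ranks]

rng = np.random.default_rng(0)
arrays = rng.lognormal(size=(6, 3))    # toy data: 6 probes x 3 arrays
print(quantile_normalize(arrays).round(2))
```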


Advancing Computational Chemistry with NWChem

November 18, 2014 3:07 pm | by Mike Bernhardt, HPC Community Evangelist, Intel

An interview with PNNL’s Karol Kowalski, Capability Lead for NWChem Development. NWChem is an open source, high-performance computational chemistry tool developed for the Department of Energy at Pacific Northwest National Lab in Richland, WA. I recently visited with Kowalski, who works in the Environmental Molecular Sciences Laboratory (EMSL) at PNNL.

Steve Conway is Research VP, HPC at IDC.

Small and Medium Enterprises Enter the Limelight

November 14, 2014 11:43 am | by Steve Conway

A decade of close scrutiny has shed much more light on the technical computing needs of small and medium enterprises (SMEs), but they are still shrouded in partial darkness. That’s hardly surprising for a diverse global group with millions of members ranging from automotive suppliers and shotgun genomics labs to corner newsstands and strip mall nail salons.

John Kirkley, President of Kirkley Communications, is a writer and editor who specializes in HPC.

A New Dawn: Bringing HPC to Smaller Manufacturers

November 13, 2014 11:26 am | by John Kirkley

Folk wisdom can sometimes be right on target. For example, there’s that old bromide about leading a horse to water. In this case, the water is high performance computing, and the reluctant equine is the huge base of small- to medium-sized manufacturers (SMMs) in the U.S. According to the National Center for Manufacturing Sciences, there are approximately 300,000 manufacturers in the U.S. Over 95 percent of them can be characterized as SMMs.

Specific bits of a digital image file that have been replaced with the bits of a secret steganographic payload permit a covert agent to post top-secret documents on their Facebook wall by simply uploading what appear to be cute images of kittens.

Leading the Eyewitness: Digital Image Forensics in a Megapixel World

November 12, 2014 3:42 pm | by William Weaver, Ph.D.

Current research in digital image forensics is developing better ways to convert image files into frequency representations, such as wavelet transforms in addition to the more traditional cosine transforms, as well as more sensitive methods for determining whether each area of an image belongs to the whole.
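
The cosine-transform side of that toolkit is straightforward to sketch: JPEG-style analysis views an image as 8x8 blocks in the frequency domain, where recompression and splicing leave statistical fingerprints. A rough SciPy illustration (a sketch of the representation only, not a forensic detector; the random "image" is a stand-in):

```python
# Rough sketch of the frequency-domain view used in image forensics: a
# JPEG-style 8x8 block DCT of a grayscale image, the representation in
# which recompression and splicing leave statistical fingerprints.
import numpy as np
from scipy.fft import dctn

def block_dct(image, block=8):
    h, w = (d - d % block for d in image.shape)         # crop to whole blocks
    out = np.empty((h, w))
    for r in range(0, h, block):
        for c in range(0, w, block):
            out[r:r + block, c:c + block] = dctn(
                image[r:r + block, c:c + block], norm="ortho")
    return out

rng = np.random.default_rng(1)
img = rng.integers(0, 256, size=(64, 64)).astype(float)  # stand-in "image"
coeffs = block_dct(img)
print(coeffs[0, 0].round(1))           # DC coefficient of the first block
```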


Women Who Compute: Overcoming Lack of Gender Diversity in Science and Technology

November 11, 2014 3:17 pm | by Rob Farber

Recent gender diversity reports from Google, Facebook and Apple (to name a few) have spurred a number of positive efforts to bring more women into computer science, including the SC14 Women in High Performance Computing workshop, NVIDIA’s Women who CUDA campaign, and Google’s $50M Women Who Code program.
