
Mobile Tech between a Rock and a Hard Place

January 6, 2014 2:17 pm | by Rob Farber

Mobile technology is where the money is right now in computer technology. Current leadership-class supercomputers are “wowing” the HPC world with petaflop/s performance through the combined use of several thousand GPUs or Intel Xeon Phi coprocessors, but in reality the sale of a few thousand of these devices is insignificant when compared against the 1.5 billion cellphone processors and 190 million tablet processors ...


Unscrambler X 10.3: Useful Niche Software

January 6, 2014 1:38 pm | by John A. Wass, Ph.D.

Software Review: Unscrambler statistical software is geared to two of the most useful areas of industrial R&D, namely multivariate analysis and experimental design. The latest version (10.3) of this useful niche software has a number of additions and upgrades, including regression and classification methods, exploratory data analysis tools, predictive modeling, extensive pre-processing options, and descriptive statistics with tests.


The 12 Days of Pascal's Triangular Christmas

December 24, 2013 10:05 am | by Michael Rose, University of Newcastle

One of the most magical aspects of mathematics is the ability for complex and intricate structures to emerge from the simplest of rules. Few mathematical objects are simpler to create — and few weave such a variety of intricate patterns — as Pascal’s marvellous triangle.


DOE to Showcase Computational Science Expertise at SC13 Conference

December 5, 2013 4:34 pm | by DOE

After unexpectedly missing the opportunity to exhibit their expertise at SC12, the Department of Energy (DOE) national laboratories will return to the conference exhibition at the SC13 international conference for high performance computing, networking, storage and analysis, to be held November 17 to 22 at the Colorado Convention Center (CCC) in Denver.


Preserving Sanity in the Face of Rampant Technology Change

December 4, 2013 4:16 pm | by Rob Farber

Change is a given in the technology world as new products excite interest, generate sales and, ultimately, define profitability. No technology company is “too big to fail,” which means that the current market giants recognize they can easily become a name from the past, like Sun Microsystems and Digital Equipment Corporation, unless they aggressively innovate.


Meet HPC Innovator Taghrid Samak

December 3, 2013 4:03 pm | by Jon Bashor, Berkeley Lab Computational Research Division

Everything leading up to the actual coding, the work of figuring out how to make it all fit together, is what Samak enjoys most. One of the problems she is working on with the Department of Energy’s Joint Genome Institute (JGI) is a data mining method to automatically identify errors in genome assembly, replacing the current approach of manually inspecting the assembly.


A Q&A with Taghrid Samak, LBNL Research Scientist

December 3, 2013 3:50 pm | by Jon Bashor, Lawrence Berkeley National Laboratory

Taghrid Samak of Berkeley Lab’s Computational Research Division admits with a laugh that she wasn’t one of those kids who started programming on the home computer at age 10. And if she hadn’t followed her father’s advice, she might have ended up looking for political solutions to pressing problems, rather than working on computational approaches to scientific challenges.


How Does Your HPC Service Compare with the Industry’s Best?

November 27, 2013 11:44 am | by Andrew Jones

It is easy to cast jealous eyes towards the most powerful supercomputers in the world, e.g. Tianhe-2 with its three million cores of Xeon and Phi processors, or Titan with its 18,000 GPUs, and wish you had the budget to deploy such facilities. However, most HPC service managers and users must return from such whims, plummeting back to the much smaller-scale HPC that is their reality.


Global Pilot Study Zeroes in on HPC ROI

November 26, 2013 11:31 am | by Steve Conway

Not long ago, I wrote in this publication that the end of the Cold War inaugurated a multi-year shift in government funding rationales for HPC. The historical heavy tilt toward national security and advanced science/engineering is slowly being counterbalanced by arguments for return on investment (ROI).


Supercomputer Simulations Help Lay Foundation for Better, Greener Concrete

November 25, 2013 11:12 am | by Jim Collins, Argonne National Laboratory

NIST researchers are conducting simulations at the Argonne Leadership Computing Facility to advance the measurement science of concrete and to gain a fundamental understanding of how it flows. The NIST research team is combining data from large-scale simulations with theoretical work and physical experiments to create Standard Reference Materials (SRMs) for concrete to allow for more accurate viscosity measurements.


FDA’s Focus on Laboratory Data Integrity – Part 2

September 15, 2013 3:31 pm | by R.D. McDowall, Ph.D.

A further look at this current emphasis and a few problems inspectors have identified. The integrity of data generated by a regulated laboratory can make or break a regulatory inspection or audit. If you think that the warning letter citations quoted in Part 1 of this article were bad, spare a thought for another company...


FDA’s Focus on Laboratory Data Integrity – Part 1

September 10, 2013 3:04 pm | by R.D. McDowall, Ph.D.

The integrity of data generated by a regulated laboratory can make or break a regulatory inspection or audit. This paper looks at what is required for data integrity on the basis of the GMP regulations. It presents examples of non-compliances found in warning letters and a regulatory action from the U.S. Food and Drug Administration (FDA).


Less is More: Adopting a Self-documenting Paperless Mindset

September 10, 2013 10:52 am | by Peter J. Boogaard

The paper versus paperless discussion is as old as the existence of commercial computers. In 1975, just after the introduction of the first personal computer, the Scelbi (SCientific, ELectronic and BIological), Business Week was already predicting that computer records would soon completely replace paper. We all know that it took over 25 years before paperless operations were accepted and successfully adopted in our daily work.


Hardware for Big Data, Graphs and Large-scale Computation

September 9, 2013 9:58 am | by Rob Farber

Recent announcements by Intel and NVIDIA indicate that massively parallel computing with GPUs and Intel Xeon Phi will no longer require passing data via the PCIe bus. The bad news is that these standalone devices are still in the design phase and are not yet available for purchase.


Book Review: Applied Regression Modeling

September 1, 2013 5:47 pm | by John A. Wass, Ph.D.

It is always a pleasure to review a text that is easy to read and understand when it is targeted to a novice audience. This book was written for business majors at the junior undergraduate level, not statistics majors. However, readers are advised to take a course in introductory statistics before using this book.

