Data Integrity in a Nutshell: Industry must take bold steps to assure the data used for drug quality decisions is trustworthy

January 7, 2014 12:31 pm | by Mark E. Newton

Regulatory inspectors have started digging much deeper into data, no longer accepting batch release data and supportive testing at face value. Even worse, this effort is justified: they have cited a number of firms for violations of data integrity, a most fundamental bond of trust between manufacturers and the regulators that inspect them. Industry must take bold steps to assure the data used for drug quality decisions is trustworthy...
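To make "trustworthy data" concrete, here is a minimal sketch (in Python, with illustrative file names and a hypothetical `*.raw` extension) of one tamper-evidence technique: fingerprinting raw result files with SHA-256 digests so that any later modification is detectable against a stored manifest. This is a generic illustration, not a method the article prescribes.

```python
# Sketch: fingerprint raw data files with SHA-256 so any later
# modification is detectable. File names/extensions are illustrative.
import hashlib
import json
from pathlib import Path

def fingerprint(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def record_fingerprints(data_dir: Path, manifest: Path) -> None:
    """Write a manifest mapping file name -> digest for raw data files."""
    digests = {p.name: fingerprint(p) for p in sorted(data_dir.glob("*.raw"))}
    manifest.write_text(json.dumps(digests, indent=2))

def verify_fingerprints(data_dir: Path, manifest: Path) -> list[str]:
    """Return names of files whose contents no longer match the manifest."""
    expected = json.loads(manifest.read_text())
    return [name for name, digest in expected.items()
            if fingerprint(data_dir / name) != digest]
```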


Comparison of FDA and EU Regulations for Audit Trails

January 7, 2014 12:02 pm | by R.D. McDowall

Data integrity is a current hot topic with regulatory agencies, as seen with recent publications in this magazine, and audit trails are an important aspect of ensuring this in computerized systems. The purpose of this article is to compare and contrast the EU and FDA GMP regulatory requirements for computerized system audit trails.
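As background for that comparison, the following minimal sketch shows the kind of record both regimes broadly expect a computerized-system audit trail to capture: who changed what, when, the value before and after, and why. The `AuditEntry` class and its field names are illustrative assumptions, not taken from either regulation.

```python
# Minimal sketch of an append-only audit trail entry of the kind both
# 21 CFR Part 11 and EU GMP Annex 11 broadly expect. Field names are
# illustrative, not regulatory text.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)
class AuditEntry:
    user: str         # authenticated user who made the change
    record_id: str    # identifier of the affected record
    field_name: str   # which attribute was changed
    old_value: str    # value before the change (preserved, not overwritten)
    new_value: str    # value after the change
    reason: str       # reason for change, where required
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

audit_log: list[AuditEntry] = []  # append-only; entries are never edited

audit_log.append(AuditEntry(
    user="jsmith",
    record_id="BATCH-0421",
    field_name="assay_result",
    old_value="98.7",
    new_value="99.1",
    reason="transcription error corrected",
))
```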


A Fresh Look at the AnIML Data Standard

January 7, 2014 11:47 am | by Burkhard Schaefer

One of the challenges in laboratory data management is the handling and exchange of experiment data. Many vendors provide excellent instruments, but most produce data in their own proprietary formats. This leads to major difficulties for data processing, collaboration, instrument integration and archiving. The ASTM AnIML standardization effort addresses these problems by providing a neutral XML-based format for exchanging scientific data.
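To make the idea of a neutral XML exchange format concrete, here is a small sketch that serializes one measurement series as XML using only Python's standard library. The element names (`Series`, `Point`, and so on) are simplified illustrations and do not follow the actual AnIML schema.

```python
# Sketch: a vendor-neutral, XML-based record of one measurement series,
# in the spirit of AnIML. Element names are simplified illustrations,
# not the actual AnIML schema.
import xml.etree.ElementTree as ET

root = ET.Element("AnIML")
series = ET.SubElement(root, "Series", name="UV absorbance", unit="AU")
for wavelength, absorbance in [(400, 0.12), (450, 0.34), (500, 0.27)]:
    point = ET.SubElement(series, "Point")
    ET.SubElement(point, "Wavelength", unit="nm").text = str(wavelength)
    ET.SubElement(point, "Value").text = str(absorbance)

# Any consumer that can parse XML can now read the data, regardless of
# which instrument produced it.
print(ET.tostring(root, encoding="unicode"))
```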


Mobile Tech between a Rock and a Hard Place

January 6, 2014 2:17 pm | by Rob Farber

Mobile technology is where the money is right now in computer technology. Current leadership-class supercomputers are “wowing” the HPC world with petaflop/s performance through the combined use of several thousand GPUs or Intel Xeon Phi coprocessors, but in reality the sale of a few thousand of these devices is insignificant compared with the 1.5 billion cellphone processors and 190 million tablet processors ...


Unscrambler X 10.3: Useful Niche Software

January 6, 2014 1:38 pm | by John A. Wass, Ph.D.

Software Review: Unscrambler statistical software is geared to two of the most useful areas of industrial R&D, namely multivariate analysis and experimental design. The latest version (10.3) of this useful niche software has a number of additions and upgrades, including regression and classification methods, exploratory data analysis tools, predictive modeling, extensive pre-processing options, and descriptive statistics with tests.
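Unscrambler itself is a commercial GUI package, but a short sketch can illustrate the kind of multivariate analysis such software automates: a principal component analysis computed from scratch with NumPy. This is a generic example of the technique, not Unscrambler's interface.

```python
# Generic multivariate analysis sketch: PCA via SVD in pure NumPy.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 6))           # 50 samples, 6 variables
X[:, 1] = 0.9 * X[:, 0] + 0.1 * X[:, 1]  # make two variables correlate

Xc = X - X.mean(axis=0)                # mean-center each variable
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s                         # sample coordinates on the PCs
explained = s**2 / np.sum(s**2)        # fraction of variance per component

print("variance explained:", np.round(explained, 3))
```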


The 12 Days of Pascal's Triangular Christmas

December 24, 2013 10:05 am | by Michael Rose, University of Newcastle

One of the most magical aspects of mathematics is the ability for complex and intricate structures to emerge from the simplest of rules. Few mathematical objects are simpler to create — and few weave such a variety of intricate patterns — as Pascal’s marvellous triangle.
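That simple rule (each entry is the sum of the two entries directly above it, with ones on the edges) fits in a few lines of Python:

```python
# Pascal's triangle from its one rule: each entry is the sum of the
# two entries above it; the edges are 1.
def pascal_rows(n):
    row = [1]
    for _ in range(n):
        yield row
        row = [1] + [a + b for a, b in zip(row, row[1:])] + [1]

for row in pascal_rows(6):
    print(row)
# [1]
# [1, 1]
# [1, 2, 1]
# [1, 3, 3, 1]
# [1, 4, 6, 4, 1]
# [1, 5, 10, 10, 5, 1]
```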

DOE to Showcase Computational Science Expertise at SC13 Conference

December 5, 2013 4:34 pm | by DOE

After unexpectedly missing the opportunity to exhibit their expertise at SC12, the Department of Energy (DOE) national laboratories will return to the conference exhibition at the SC13 international conference for high performance computing, networking, storage and analysis, to be held November 17 to 22 at the Colorado Convention Center (CCC) in Denver.


Preserving Sanity in the Face of Rampant Technology Change

December 4, 2013 4:16 pm | by Rob Farber

Change is a given in the technology world as new products excite interest, generate sales and, ultimately, define profitability. No technology company is “too big to fail,” which means that the current market giants recognize they can easily become a name from the past, like Sun Microsystems and Digital Equipment Corporation, unless they aggressively innovate.


Meet HPC Innovator Taghrid Samak

December 3, 2013 4:03 pm | by Jon Bashor, Berkeley Lab Computational Research Division

Everything leading up to the actual coding, figuring out how to make it work, is what Samak enjoys most. One of the problems she is working on with the Department of Energy’s Joint Genome Institute (JGI) is a data mining method to automatically identify errors in genome assembly, replacing the current approach of manually inspecting the assembly.
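The article does not detail JGI's method, but one common signal of mis-assembly is anomalous read coverage, and a toy version of mining for it is easy to sketch. The window size, threshold, and simulated data below are all illustrative assumptions, not JGI's actual approach.

```python
# Hedged sketch: flag assembly windows whose mean read coverage is an
# outlier, a common (simplified) signal of mis-assembly.
import numpy as np

def flag_suspect_windows(coverage, window=100, z_cut=3.0):
    """Return start positions of windows with outlying mean coverage."""
    n = len(coverage) // window
    means = coverage[: n * window].reshape(n, window).mean(axis=1)
    z = (means - means.mean()) / means.std()
    return [i * window for i in np.flatnonzero(np.abs(z) > z_cut)]

rng = np.random.default_rng(1)
cov = rng.poisson(30, size=10_000).astype(float)  # simulated coverage
cov[4_000:4_100] = 2.0                 # simulated drop from a mis-join
print(flag_suspect_windows(cov))       # expected: [4000] with this seed
```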


A Q&A with Taghrid Samak, LBNL Research Scientist

December 3, 2013 3:50 pm | by Jon Bashor, Lawrence Berkeley National Laboratory

Taghrid Samak of Berkeley Lab’s Computational Research Division admits with a laugh that she wasn’t one of those kids who started programming on the home computer at age 10. And if she hadn’t followed her father’s advice, she might have ended up looking for political solutions to pressing problems, rather than working on computational approaches to scientific challenges.


How Does your HPC Service Compare with the Industry’s Best?

November 27, 2013 11:44 am | by Andrew Jones

It is easy to cast jealous eyes towards the most powerful supercomputers in the world, e.g. Tianhe-2 with its three million cores of Xeon and Phi processors, or Titan with its 18,000 GPUs, and wish you had the budget to deploy such facilities. However, most HPC service managers and users must return from such whims, plummeting back to the much smaller-scale HPC that is their reality.


Global Pilot Study Zeroes in on HPC ROI

November 26, 2013 11:31 am | by Steve Conway

Not long ago, I wrote in this publication that the end of the Cold War inaugurated a multi-year shift in government funding rationales for HPC. The historical heavy tilt toward national security and advanced science/engineering is slowly being counterbalanced by arguments based on return on investment (ROI).


Supercomputer Simulations Help Lay Foundation for Better, Greener Concrete

November 25, 2013 11:12 am | by Jim Collins, Argonne National Laboratory

NIST researchers are conducting simulations at the Argonne Leadership Computing Facility to advance the measurement science of concrete and to gain a fundamental understanding of how it flows. The NIST research team is combining data from large-scale simulations with theoretical work and physical experiments to create Standard Reference Materials (SRMs) for concrete to allow for more accurate viscosity measurements.


FDA’s Focus on Laboratory Data Integrity – Part 2

September 15, 2013 3:31 pm | by R.D. McDowall

A further look at this current emphasis and a few problems inspectors have identified. The integrity of data generated by a regulated laboratory can make or break a regulatory inspection or audit. If you think that the warning letter citations quoted in part 1 of this article were bad, give a thought for another company...


FDA’s Focus on Laboratory Data Integrity – Part 1

September 10, 2013 3:04 pm | by R.D. McDowall

The integrity of data generated by a regulated laboratory can make or break a regulatory inspection or audit. This paper looks at what the GMP regulations require for data integrity, presenting examples of non-compliances found in warning letters and a regulatory action from the U.S. Food and Drug Administration (FDA).

