Big Data tools such as Grok and IBM Watson are enabling large organizations to behave more like agile startups.

Of the transformative technology developments that have ushered in the current frenzy of activity along the information superhighway, the 1994 invention of the “Wiki” by Ward Cunningham is among the most disruptive.
Encryption and nuclear weapons are two easily recognized examples where a combinatorial explosion is a sought-after characteristic. In the software development world, combinatorial explosions are bad. In particular, it is far too easy to become lost in the minutiae of writing code that can run efficiently on NVIDIA GPUs, AMD GPUs, x86, ARM and Intel Xeon Phi while also addressing the numerous compiler and user interface vagaries.
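One widely used way to contain that explosion is to confine the hardware-specific decision to a single dispatch point, so each new target adds a branch rather than a parallel codebase. The Python sketch below is a minimal illustration of the pattern (CuPy and NumPy are real libraries; the helper name and fallback policy are assumptions for illustration), not a full answer to the multi-architecture problem the article describes:

```python
# Illustrative sketch: isolate hardware-specific choices behind one
# dispatch point so each new target adds a branch, not a new codebase.
def get_array_module():
    try:
        import cupy  # NVIDIA GPUs, if CUDA and CuPy are available
        return cupy
    except ImportError:
        import numpy  # portable CPU fallback (x86, ARM, ...)
        return numpy

xp = get_array_module()
a = xp.arange(1_000_000, dtype=xp.float32)
print(float(a.sum()))  # identical source runs on either backend
```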
Data Integrity in a Nutshell: Industry must take bold steps to assure the data used for drug quality decisions is trustworthy (January 7, 2014, by Mark E. Newton)
Regulatory inspectors have started digging much deeper into data, no longer accepting batch release data and supportive testing at face value. Worse, this scrutiny is justified: inspectors have cited a number of firms for violations of data integrity, the most fundamental bond of trust between manufacturers and the regulators that inspect them. Industry must take bold steps to assure the data used for drug quality decisions is trustworthy...
Data integrity is a current hot topic with regulatory agencies, as seen in recent publications in this magazine, and audit trails are an important aspect of ensuring it in computerized systems. The purpose of this article is to compare and contrast the EU and FDA GMP regulatory requirements for computerized system audit trails.
One of the challenges in laboratory data management is the handling and exchange of experiment data. Many vendors provide excellent instruments, but most produce data in their own proprietary formats. This leads to major difficulties for data processing, collaboration, instrument integration and archiving. The ASTM AnIML standardization effort addresses these problems by providing a neutral XML-based format for exchanging scientific data.
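As a minimal sketch of what consuming such a file might look like, the Python below walks an AnIML document using only the standard library. The element names (ExperimentStep, Series) follow the AnIML schema, but the file name, attributes, and overall structure here are illustrative assumptions rather than a tested integration:

```python
# Hedged sketch: list the experiment steps and data series in an AnIML
# file.  Element and attribute names are taken from the AnIML schema
# but should be treated as illustrative, not authoritative.
import xml.etree.ElementTree as ET

def local_name(tag):
    """Strip the XML namespace so we can match on local element names."""
    return tag.rsplit('}', 1)[-1]

tree = ET.parse("chromatogram.animl")  # hypothetical input file
for elem in tree.iter():
    if local_name(elem.tag) == "ExperimentStep":
        print("Step:", elem.get("name"))
    elif local_name(elem.tag) == "Series":
        print("  Series:", elem.get("name"))
```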
Mobile technology is where the money is right now in computing. Current leadership-class supercomputers are “wowing” the HPC world with petaflop/s performance through the combined use of several thousand GPUs or Intel Xeon Phi coprocessors, but in reality the sale of a few thousand of these devices is insignificant when compared against the 1.5 billion cellphone processors and 190 million tablet processors ...
Software Review: Unscrambler statistical software is geared to two of the most useful areas of industrial R&D, namely multivariate analysis and experimental design. The latest version (10.3) of this niche package has a number of additions and upgrades, including regression and classification methods, exploratory data analysis tools, predictive modeling, extensive pre-processing options, and descriptive statistics with tests.
One of the most magical aspects of mathematics is the ability for complex and intricate structures to emerge from the simplest of rules. Few mathematical objects are simpler to create — and few weave such a variety of intricate patterns — as Pascal’s marvellous triangle.
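To make the “simplest of rules” concrete, here is a short illustrative Python snippet that builds the triangle from its one generating rule: each entry is the sum of the two entries above it.

```python
# Build Pascal's triangle from a single rule: each entry is the sum of
# the two entries above it (with an implicit 0 beyond either edge).
def pascal_rows(n):
    row = [1]
    for _ in range(n):
        yield row
        row = [a + b for a, b in zip([0] + row, row + [0])]

for row in pascal_rows(6):
    print(*row)
```

Printing each entry modulo 2 instead of its value reveals the Sierpinski gasket, one classic example of the intricate patterns the triangle weaves.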
After unexpectedly missing the opportunity to exhibit their expertise at SC12, the Department of Energy (DOE) national laboratories will return to the exhibition floor at SC13, the international conference for high performance computing, networking, storage and analysis, to be held November 17 to 22 at the Colorado Convention Center in Denver.
Change is a given in the technology world as new products excite interest, generate sales and, ultimately, define profitability. No technology company is “too big to fail,” which means that the current market giants recognize they can easily become a name from the past, like Sun Microsystems and Digital Equipment Corporation, unless they aggressively innovate.
Taghrid Samak of Berkeley Lab’s Computational Research Division admits with a laugh that she wasn’t one of those kids who started programming on the home computer at age 10. And if she hadn’t followed her father’s advice, she might have ended up looking for political solutions to pressing problems, rather than working on computational approaches to scientific challenges.
Everything leading up to the actual coding, figuring out how to make it work, is what Samak enjoys most. One of the problems she is working on with the Department of Energy’s Joint Genome Institute (JGI) is a data mining method to automatically identify errors in genome assembly, replacing the current approach of manually inspecting the assembly.
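The article does not detail the JGI method, so the Python below is only a generic stand-in for the idea: flagging assembly windows whose read coverage is anomalous, one common signal of mis-assembly. The function name, data, and threshold are all hypothetical.

```python
# Generic illustration (not the JGI method): flag windows of an assembly
# whose read coverage deviates sharply from the contig-wide norm, using a
# robust (median/MAD) z-score so outliers do not mask themselves.
import numpy as np

def flag_suspect_windows(coverage, z_threshold=3.0):
    cov = np.asarray(coverage, dtype=float)
    med = np.median(cov)
    mad = np.median(np.abs(cov - med)) * 1.4826  # robust stand-in for std dev
    return np.flatnonzero(np.abs(cov - med) / mad > z_threshold)

# Hypothetical per-window read coverage along one contig.
coverage = [52, 49, 51, 48, 50, 5, 47, 53, 180, 50]
print(flag_suspect_windows(coverage))  # -> [5 8]: deletion / collapsed-repeat candidates
```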
It is easy to cast jealous eyes towards the most powerful supercomputers in the world, e.g. Tianhe-2 with its three million cores of Xeon and Phi processors, or Titan with its 18,000 GPUs, and wish you had the budget to deploy such facilities. However, most HPC service managers and users must return from such whims, plummeting back to the much smaller-scale HPC that is their reality.
Not long ago, I wrote in this publication that the end of the Cold War inaugurated a multi-year shift in government funding rationales for HPC. The historical heavy tilt toward national security and advanced science/engineering is slowly being counterbalanced by arguments for returns-on-investment (ROI).
NIST researchers are conducting simulations at the Argonne Leadership Computing Facility to advance the measurement science of concrete and to gain a fundamental understanding of how it flows. The NIST research team is combining data from large-scale simulations with theoretical work and physical experiments to create Standard Reference Materials (SRMs) for concrete to allow for more accurate viscosity measurements.
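Fresh concrete is commonly modeled as a Bingham fluid, whose shear stress is τ = τ₀ + μ·γ̇ (a yield stress plus plastic viscosity times shear rate). As a hedged illustration of the kind of analysis involved, rather than NIST’s actual pipeline, the snippet below fits hypothetical simulated stress/shear-rate pairs to that line to recover the viscosity:

```python
# Minimal sketch (not NIST's code): fit simulated rheometer output to the
# Bingham model  tau = tau0 + mu * gamma_dot  to recover plastic viscosity.
import numpy as np

# Hypothetical (shear rate [1/s], shear stress [Pa]) pairs from a simulation.
gamma_dot = np.array([5.0, 10.0, 20.0, 40.0, 80.0])
tau = np.array([210.0, 260.0, 370.0, 580.0, 1010.0])

mu, tau0 = np.polyfit(gamma_dot, tau, 1)  # slope = viscosity, intercept = yield stress
print(f"plastic viscosity ~ {mu:.2f} Pa*s, yield stress ~ {tau0:.0f} Pa")
```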