Taghrid Samak of Berkeley Lab’s Computational Research Division admits with a laugh that she wasn’t one of those kids who started programming on the home computer at age 10. And if she hadn’t followed her father’s advice, she might have ended up looking for political solutions to pressing problems rather than working on computational approaches to scientific challenges.
Everything leading up to the actual coding, figuring out how to make it work, is what Samak enjoys most. One of the problems she is working on with the Department of Energy’s Joint Genome Institute (JGI) is a data mining method that automatically identifies errors in genome assembly, replacing the current approach of manually inspecting each assembly.
It is easy to cast jealous eyes towards the most powerful supercomputers in the world, such as Tianhe-2 with its three million cores of Xeon and Xeon Phi processors, or Titan with its 18,000 GPUs, and to wish you had the budget to deploy such facilities. Most HPC service managers and users, however, must set such daydreams aside and return to the much smaller-scale HPC that is their reality.
Not long ago, I wrote in this publication that the end of the Cold War inaugurated a multi-year shift in government funding rationales for HPC. The historical heavy tilt toward national security and advanced science and engineering is slowly being counterbalanced by arguments based on return on investment (ROI).
NIST researchers are conducting simulations at the Argonne Leadership Computing Facility to advance the measurement science of concrete and to gain a fundamental understanding of how it flows. The NIST research team is combining data from large-scale simulations with theoretical work and physical experiments to create Standard Reference Materials (SRMs) for concrete to allow for more accurate viscosity measurements.
A further look at this current emphasis and a few problems inspectors have identified. The integrity of data generated by a regulated laboratory can make or break a regulatory inspection or audit. If you think the warning letter citations quoted in part 1 of this article were bad, spare a thought for another company...
The integrity of data generated by a regulated laboratory can make or break a regulatory inspection or audit. This paper looks at what is required for data integrity on the basis of the GMP regulations. It presents examples of non-compliances found in warning letters and a regulatory action from the U.S. Food and Drug Administration (FDA).
The paper-versus-paperless discussion is as old as commercial computing itself. In 1975, just after the introduction of the first personal computer, the Scelbi (SCientific, ELectronic and BIological), Business Week was already predicting that computer records would soon completely replace paper. We all know that it took more than 25 years before paperless operations were accepted and successfully adopted in our daily work.
Recent announcements by Intel and NVIDIA indicate that massively parallel computing with GPUs and Intel Xeon Phi will no longer require passing data via the PCIe bus. The bad news is that these standalone devices are still in the design phase and are not yet available for purchase.
It is always a pleasure to review a text that is easy to read and understand when it is aimed at a novice audience. This book was written for business majors at the junior undergraduate level, not for statistics majors. Readers are nevertheless advised to have taken an introductory statistics course before using it.
The rapid increase of investment in biotherapeutics is changing the profile of the biopharmaceutical industry and, along with it, data management in the laboratory. With attention on longer patent life, high barriers to generic equivalents, and personalized medicine, an increasing portion of R&D spending is being allocated to large-molecule therapies.
The adage that a supercomputer is a complicated device that turns a compute-bound problem into an I/O-bound problem is becoming ever more apparent in the age of big data. The trick to avoiding the painful truth of this adage is to ensure that the application workload is dominated by streaming I/O operations.
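As a rough, hypothetical illustration of what "dominated by streaming I/O" means in practice, the short Python sketch below times large sequential reads of a scratch file against reads of the same volume from random offsets. The file name, file size, and block size are assumptions made up for the example, and results will depend heavily on the page cache and the underlying storage.

    # Hypothetical sketch: sequential "streaming" reads vs random-offset reads.
    # File name, size, and block size are illustrative assumptions only.
    import os
    import random
    import time

    FILE_NAME = "scratch.bin"          # throwaway scratch file
    FILE_SIZE = 256 * 1024 * 1024      # 256 MiB keeps the test quick
    BLOCK = 4 * 1024 * 1024            # large blocks favour streaming throughput

    # Create the scratch file block by block.
    with open(FILE_NAME, "wb") as f:
        for _ in range(FILE_SIZE // BLOCK):
            f.write(os.urandom(BLOCK))

    def streaming_read():
        """Read the file front to back in large blocks (streaming pattern)."""
        start = time.perf_counter()
        with open(FILE_NAME, "rb") as f:
            while f.read(BLOCK):
                pass
        return time.perf_counter() - start

    def random_read():
        """Read the same volume of data from random offsets (seek-heavy pattern)."""
        start = time.perf_counter()
        with open(FILE_NAME, "rb") as f:
            for _ in range(FILE_SIZE // BLOCK):
                f.seek(random.randrange(0, FILE_SIZE - BLOCK))
                f.read(BLOCK)
        return time.perf_counter() - start

    print(f"streaming: {streaming_read():.2f} s")
    print(f"random   : {random_read():.2f} s")
    os.remove(FILE_NAME)

On spinning disks and many parallel file systems, the sequential loop keeps the storage hardware busy with long transfers, while the seek-heavy loop spends much of its time waiting, which is the behaviour the adage warns about.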
Discovery of the last neutrino mixing angle, one of Science magazine’s top 10 breakthroughs of 2012, was announced in March 2012, just a few months after the Daya Bay Neutrino Experiment’s first detectors went online in southeast China. Collaborating scientists from China, the United States, the Czech Republic and Russia were thrilled that their experiment was producing more data than expected.
The U.S. Department of Energy’s National Energy Research Scientific Computing Center has a straightforward approach to data: When any of the center’s 4,500 users need access to their data, NERSC needs to be able to deliver. It’s an approach that’s worked well for 39 years and helps NERSC’s users annually publish more than 1,500 scientific papers.
The HPC market is entering a kind of perfect storm. For years, HPC architectures have tilted farther and farther away from an optimal balance between processor speed, memory access and I/O speed. As successive generations of HPC systems have raised peak processor performance without corresponding advances in per-core memory capacity and speed, the systems have become increasingly compute-centric.
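To make the notion of balance concrete, here is a back-of-the-envelope sketch using made-up node figures rather than measurements of any specific machine: it computes how many bytes of memory traffic are available per floating-point operation, the ratio that shrinks when peak flop/s grows faster than memory bandwidth.

    # Hypothetical machine-balance arithmetic; the node figures are assumptions,
    # not measurements of any real system.
    nodes = {
        # name: (peak Gflop/s per node, memory bandwidth in GB/s per node)
        "older node (assumed)": (100.0, 25.0),
        "newer node (assumed)": (1000.0, 100.0),
    }

    for name, (gflops, gbytes_per_s) in nodes.items():
        balance = gbytes_per_s / gflops  # bytes of memory traffic per flop
        print(f"{name}: {balance:.2f} bytes/flop")

In this made-up comparison the newer node delivers ten times the flops but only four times the bandwidth, so its balance falls from 0.25 to 0.10 bytes per flop; codes limited by memory access therefore see a smaller fraction of peak with each new generation.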