With the automotive industry’s increasing concern about emissions and the volatile price of gasoline, electric vehicles with lithium-ion batteries are emerging as a viable and energy-efficient transportation option. To accelerate the pace of battery innovation, the U.S. Department of Energy’s (DOE’s) Vehicle Technologies Program is challenging the automotive industry to reduce the production cost of high-power batteries by 70 percent.
Safety, consistency, regulatory compliance, efficiency and profitability are just some of the issues confronting senior executives in the oil and gas industry. Managing this complexity, often in some of the world’s harshest conditions, is a constant challenge.
In late 2011, I wrote in Scientific Computing about the increasing collisions between commercial big data and data-intensive HPC. In the short time since then, the dimensions of these encounters have become clearer. This might qualify as the most serious rapprochement since business computing first branched off from scientific-technical computing in the early 1960s.
Ongoing advances in scientific technology confront us with the task of discovering scientific knowledge from enormous amounts of data generated in the life and environmental sciences, physics, sociology, business and medicine. In biology, these data are frequently combined from many sources. Further, addressing large-scale interdisciplinary problems requires diverse research teams. Researchers in biology, particularly in computational biology and systems biology, have already experienced the shift from the one-scientist-one-project paradigm to cross-disciplinary collaborative research, and the need for an infrastructure and community that can support it all.
Technology and computational evangelists quickly learn that human psychology is a key component of any project roadmap. The truth in the quip that it took one social genius plus 500,000 scientists and engineers to put a man on the moon can be appreciated when one observes the social issues that surface in technical discussions within even small groups of people. As the name implies, data-intensive computing requires large amounts of data.
Over the last 12 months, the market for ELN technology has evolved from one dominated by sales to a handful of large biotechnology and pharmaceutical companies, to a diverse landscape of markets and organizations. This has caused a sea change in both how suppliers approach the market and the features they offer.
We have enjoyed a century of innovation and productivity since Henry Ford and the Ford Motor Company developed the assembly line manufacturing process over the years 1908 to 1915. While the concept was originally proposed by Adam Smith in his work The Wealth of Nations, published in 1776, it was Ford who popularized the method of progressive assembly.
Extreme cost pressure, consumers demanding reliability and loss of market share were key drivers for the electronics and automotive industries to adopt Juran’s1 successful QbD theory. His theory of adopting continuous improvement strategies to decrease variability has been proven to deliver significantly better products and financial performance.
Five years ago, NVIDIA disrupted “business as usual” in the high performance computing industry with the release of CUDA in February 2007. The September 2011 announcement by the Texas Advanced Computing Center (TACC) of its MIC-based (Many Integrated Core) Stampede supercomputer shows that Intel has decided to compete against graphics processing units (GPUs) and other computer architectures in the “leadership class” HPC market space with the Knights Corner (KNC) many-core processor chip.
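For readers who have not seen the programming model the blurb refers to, a minimal sketch may help. The vector-addition kernel below is the canonical introductory CUDA example, not code from the article; the names (vecAdd, ha/da) and the launch geometry are illustrative choices.

    #include <stdio.h>
    #include <stdlib.h>
    #include <cuda_runtime.h>

    // Each GPU thread computes one element of the output vector.
    __global__ void vecAdd(const float *a, const float *b, float *c, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) c[i] = a[i] + b[i];
    }

    int main(void) {
        int n = 1 << 20;                      // one million elements
        size_t bytes = n * sizeof(float);

        // Host-side buffers with known inputs
        float *ha = (float *)malloc(bytes);
        float *hb = (float *)malloc(bytes);
        float *hc = (float *)malloc(bytes);
        for (int i = 0; i < n; i++) { ha[i] = 1.0f; hb[i] = 2.0f; }

        // Device buffers and explicit host-to-device copies
        float *da, *db, *dc;
        cudaMalloc((void **)&da, bytes);
        cudaMalloc((void **)&db, bytes);
        cudaMalloc((void **)&dc, bytes);
        cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
        cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

        // Launch enough 256-thread blocks to cover all n elements
        int threads = 256, blocks = (n + threads - 1) / threads;
        vecAdd<<<blocks, threads>>>(da, db, dc, n);

        cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
        printf("c[0] = %f\n", hc[0]);         // expect 3.0

        cudaFree(da); cudaFree(db); cudaFree(dc);
        free(ha); free(hb); free(hc);
        return 0;
    }

Each thread handles one array element; the explicit cudaMalloc/cudaMemcpy staging reflects CUDA as it existed in this period, before unified memory simplified host-device data movement.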
Pharmaceutical laboratories working in R&D and quality control increasingly find themselves coping with conflicting demands: tougher regulatory requirements and harsher economic realities. In order to meet these demands, new ways of dealing with process, data and system management are necessary.
Everyone working in a regulatory lab these days knows that everything must be validated. And that “everything” includes analytical methods, instruments and IT systems.
The Texas Advanced Computing Center (TACC) announcement of its forthcoming 10 petaflop/s Stampede supercomputer utilizing the Intel Many Integrated Core (MIC) architecture demonstrates a substantial commitment by Intel to design hardware that accelerates highly parallel workloads.
Bioanalytical laboratories play a crucial role in the development of pharmaceutical drug products. Labs in large biopharmaceutical companies typically partner with other internal groups, such as pre-clinical toxicology and clinical pharmacology, to support their analytical requirements.
In the fall of 2008, software developers Jeff Atwood and Joel Spolsky launched an online forum encouraging programmers to post questions that were answered by fellow visitors to the site. Called Stack Overflow, it was not the first programmers’ forum to elicit expert advice from visitors; however, it was one of the first to be totally free to its users.
Making an intensely complex subject understandable to the non-physicist. By John A. Wass, Ph.D. Well, relatively! I didn’t ask to review this book, but my wonderful contact at No Starch Press (read: Geeks Anonymous) sent The Manga Guide to Relativity, thinking that the subject might interest me. After sitting on it for too many months, I finally got around to reading a few pages. After 10 pages, I was hooked.