Making an intensely complex subject understandable to the non-physicist John A. Wass, Ph.D. Well, relatively! I didn’t ask to review this book, but my wonderful contact at No Starch Press (read Geeks Anonymous) sent The Manga Guide to Relativity thinking that the subject might interest me. After sitting on it for too many months, I finally got around to reading a few pages. After 10 pages, I was hooked.
Learning from other industries Peter J. Boogaard and Hans Griep The pharmaceutical industry is looking increasingly at holistic approaches to improve the process of bringing new products to market. Adopting these approaches can accelerate product development while lowering operational costs. Quality by Design (QbD) has come relatively late to the pharmaceutical industry.
Exascale strategies are all over the map Steve Conway The recent SC11 supercomputing conference in Seattle attracted nearly 12,000 attendees, a record high that reflects the growing vibrancy of the worldwide HPC community. The show revealed some interesting innovations, especially in memory and networking technologies, but the real spotlight was on the globalization of HPC leadership.
Charting the best course for rapid approval and favorable review starts with conformity to several agency guidelines Sandy Weinberg, Ph.D. and Ronald Fuqua, Ph.D. Ah, for the good old days! Laboratory results were dutifully recorded in bound, hardcover notebooks. Each page was dated and signed, and numbered consecutively. An error was corrected by crossing out (without obliterating) the original entry, then signing and dating the correction to indicate exactly who made the change, when and, if appropriate, why. Completed notebooks were stored in dry, fireproof file rooms.
Lessons learned from the 10th annual electronic laboratory notebook conference in Barcelona Michael H. Elliott The largest event examining the world of ELN is the annual IQPC ELNs and Advanced Laboratory Solutions conference held September 26-28 in Barcelona, Spain. This three-day event illustrated the evolution of the ELN market, as a majority of attendees have a system in production. This is a significant shift, considering that in 2002 conferees struggled to even define what an ELN should do; the talks were more about concepts than reality.
Judicious selection of a logical sequence of designs leads to a model that pays for itself Mark A. Anawis The journalist David Brinkley once said, “A successful man is one who can lay a firm foundation with the bricks others have thrown at him.” In science and engineering, a good experimental design is key both to a successful product and to understanding a process: it yields a useful mathematical model that can be used to characterize and optimize that process.
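The idea that a designed experiment produces a usable mathematical model can be sketched in a few lines. The design, factor names and response values below are invented for illustration (they are not from the article): a 2x2 full-factorial design in coded units, fit by ordinary least squares.

```python
import numpy as np

# Hypothetical illustration (not from the article): a 2^2 full-factorial
# design with coded factor levels -1/+1, e.g. temperature and pressure.
design = np.array([
    [-1, -1],
    [+1, -1],
    [-1, +1],
    [+1, +1],
])

# Simulated responses for the four runs (made-up yields, for illustration).
y = np.array([10.0, 14.0, 11.0, 19.0])

# Model matrix: intercept, two main effects, and the two-factor interaction.
X = np.column_stack([np.ones(4), design[:, 0], design[:, 1],
                     design[:, 0] * design[:, 1]])

# Least-squares fit gives the empirical model
#   y = b0 + b1*x1 + b2*x2 + b12*x1*x2
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
b0, b1, b2, b12 = coef
print(b0, b1, b2, b12)  # 13.5 3.0 1.5 1.0
```

Because the factorial design is balanced and orthogonal, each coefficient is estimated independently, which is exactly why a well-chosen design "pays for itself": the same four runs cleanly separate both main effects and their interaction.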
In the past, I have extolled STATISTICA 10 statistical software for its ability to cover just about any type of test the user may need. The product line has since been divided into focused components for special applications and more limited needs. The new Data Miner product contains all of the routine and advanced statistical tests, as well as a number of very sophisticated mining routines.
Randy C. Hice Web Exclusive At first, my business trip to Puerto Rico had all of the hallmarks of a roaring disaster. To start, the only airline that could deliver me to the Isla Del Encanto in a reasonable time frame was the redoubtable AirTran Airways, whose precursor company, ValuJet, managed in 1996 to auger a plane into the Everglades with 110 people aboard. So poisoned was the brand name that ValuJet was forced to merge with the smaller AirTran just so passengers would cross the jetway to unknowingly throw the dice one more time.
The future of ELN will require innovative and disruptive thinking Michael H. Elliott Technology convergence — it is happening all around us in the consumer world, where smartphones consolidate diaries, calendars, messaging systems and phones into a single platform. In the last two years, mobile devices such as the iPad have converged most smartphone functionality into a thin, simple-to-use mobile computing platform. App stores give users the freedom to make individual choices of software solutions. No longer are consumers beholden to monolithic applications that are difficult to install, support and use.
What’s so remarkable about it? Well, for one, it was originally geared to chemical engineers, but is now used by a wide spectrum of chemists. The package is notable for its choice of tools, namely experimental design (DOE) and multivariate statistics. It does a good job of both, and the developers have added many new features (this is NOT your father’s version 9.2!).
A look at real GPU options and some issues that still need to be addressed Jacques du Toit While the cost of high-performance computing (HPC) has been falling steadily over recent years, it may still put some people off. The advent of general-purpose graphics processing units (GPGPU) has both accelerated the cost reduction and improved the energy efficiency of some HPC installations. The experiences of those who have written GPGPU algorithms, and who know HPC systems (some with GPGPU capability), may help you decide whether this technology is right for you.
Adding three new steps allows GPU and MapReduce to work together Mike Martin A modified version of MapReduce — Google’s patented program for distributed and cluster computing — harnesses the power of graphics processing units (GPU) for large-scale, high-performance applications, claim University of California, Davis computer science researchers. In benchmark performance tests, GPMapReduce increased both speed and efficiency on a GPU cluster, explained UC Davis graduate student Jeff Stuart, who with electrical and computer engineering professor John Owens developed the new approach.
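The three GPU-specific steps the researchers added are not spelled out in this blurb, so the sketch below shows only the baseline structure any such modification extends: the classic map, shuffle and reduce phases, here as a minimal CPU-only word count in Python. All names and data are illustrative, not from the UC Davis work.

```python
from collections import defaultdict

def map_phase(documents):
    """Emit (key, value) pairs; for word count, (word, 1) per occurrence."""
    for doc in documents:
        for word in doc.split():
            yield word, 1

def shuffle_phase(pairs):
    """Group values by key, as the framework does between map and reduce."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Combine each key's values; here, sum the per-word counts."""
    return {key: sum(values) for key, values in groups.items()}

docs = ["gpu cluster gpu", "cluster computing"]
counts = reduce_phase(shuffle_phase(map_phase(docs)))
print(counts)  # {'gpu': 2, 'cluster': 2, 'computing': 1}
```

Mapping a GPU onto this pipeline is nontrivial precisely because the shuffle step implies data movement and irregular grouping, which is presumably where additional steps become necessary.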
Multiple GPU and hybrid CPU+GPU performance is heavily dependent upon vendor implementation of the PCIe bus Rob Farber A single GPU can deliver order-of-magnitude speedups over a conventional processor. Plugging two or four GPUs into a workstation or computational node can double or quadruple the performance of computational applications and games. Even more performance can be achieved by using the multicore capability of the host processor in concert with the GPUs in a system.
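Why the vendor's PCIe layout matters can be shown with back-of-envelope arithmetic. The figures below are assumptions for illustration, not from the article: a PCIe 2.0 x16 link moves roughly 8 GB/s per direction, and if a motherboard hangs several GPUs off one shared link through a switch, transfer-bound workloads split that budget.

```python
# Assumed figure: approx. usable one-direction bandwidth of a PCIe 2.0
# x16 link (16 lanes at ~500 MB/s each). Real systems vary.
LINK_GBPS = 8.0

def per_gpu_bandwidth(num_gpus, links):
    """Bandwidth each GPU sees when num_gpus share `links` x16 links,
    assuming transfers are evenly spread and the links are saturated."""
    gpus_per_link = num_gpus / links
    return LINK_GBPS / gpus_per_link

# Four GPUs on dedicated links vs. four GPUs behind a single switch:
print(per_gpu_bandwidth(4, links=4))  # 8.0 GB/s each
print(per_gpu_bandwidth(4, links=1))  # 2.0 GB/s each
```

Under these assumptions, two otherwise identical four-GPU boxes differ by 4x in host-to-device bandwidth per GPU, which is why transfer-heavy codes may scale well on one vendor's node and poorly on another's.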
General-purpose computing on graphics processing units — a brief history William L. Weaver, Ph.D. In the mid-1970s, filmmaker George Lucas sought to capture his vision for a space opera called The Star Wars on motion picture film. The difficulty was that the technology did not yet exist to create the vast special-effects sequences that were required to tell an ancient story set in a galaxy far, far away. To solve this problem, Lucas acquired space in a vacant warehouse located next to the Van Nuys Airport near Los Angeles, CA, and assembled an interdisciplinary team of special effects artists, model makers and engineers that later became known as Industrial Light and Magic (ILM).
While still generally experimental, GPUs have tripled their worldwide footprint in the past two years Steve Conway What a difference two years can make in the fast-paced world of HPC technology adoption — a market considerably less risk-averse than its mainstream IT counterpart. IDC’s 2008 worldwide study on HPC processors revealed that nine percent of HPC sites were using some form of accelerator technology in their installed systems. GPGPUs (henceforth to be called GPUs) shared the accelerator habitat back then with FPGAs, Cell processors and a few rarer species.