Emerging applications embrace the decentralized nature of R&D Matthew R. Shanahan In science-based industries, R&D is distributed, decentralized and ever-changing, and so is its data. Driven by automation and informatics, this landscape presents scientists with the unprecedented challenge of accessing data across a fluid and virtual array of experimental methods, diverse applications and databases. The fragmentation and scale of the data make integrating laboratory results and efforts difficult to achieve with traditional technologies.
Disaster is in the eyes of the beholder Randy C. Hice Disaster is a relative term. To a high school senior, a freakish case of acne prior to the prom is a major disaster. To the IT professional, a wayward rat biting through a cable can invoke disaster. But I’m sure neither of these individuals would ever slide the scale of definition to the disaster setting if they had lived through a natural catastrophe. Disaster recovery is in the news again; too soon for some, and not soon enough for others.
A sound case requires a detailed review of processes and practices Mark Fish Implementing a laboratory information management system (LIMS) or upgrading an existing one can require a significant investment and must compete with other IT initiatives for funding. Gaining approval for such a major project requires a compelling proposal that both justifies the costs and demonstrates the value of LIMS to the organization.
A look at international standardization efforts in the scientific arena Professor Antony N. Davies, Dr. Maren Fiege and Dr. Peter Lampen The year 2005 has seen great strides made in the field of standardization of scientific data formats. The ASTM International and IUPAC projects on the Analytical Information Markup Language (AnIML) have been tackling the complexities of various analytical data models. This has included testing the draft schema, on which the 2004 prototype was built, against ever more complicated data structures found in the analytical laboratory.
A LIMS survey with an attitude Randy C. Hice More than a year ago, I sat at my desk darkly ranting about the various LIMS surveys, awards and ratings foisted upon the consumer market, and the fact that my visibility in the industry has somehow transformed me into a magnet for a variety of marketing shills, scams and thinly-veiled exploitative Trojan Horses from all corners of the industry. Just last week, I was notified of a "LIMS Selection Guide."
With a new research facility, the Gladstone Institutes seizes the opportunity to plan scientific computing solutions from the ground up Reginald L. Drakeford, Sr. For 25 years, The J. David Gladstone Institutes was spread across several century-old buildings at the sprawling San Francisco General Hospital campus. Then in 1999, a golden opportunity for IT planning emerged: Gladstone, renowned for its basic research into the causes of cardiovascular, virological and neurological diseases, made the decision to build a new research facility from the ground up. The IT team would have the chance to work with architects, engineers and contractors on every detail.
A zero-to-sixty look at pivotal growth areas Phil Fraher As more companies take advantage of high performance computing (HPC), two trends are emerging: visualization and the evolution of 64-bit software. Following is an overview of why visualization will be a necessary next step for data analysis, and why software will need to evolve to support increasing power demands. Most companies still rely on historical data to help them make the right business decisions. Technically, historical review is the first of four data analysis frontiers, followed by visualization, predictive analytics
To wrap up our programming topics mini-arc, I'd like to invite you to investigate a relatively new interpreted programming language, Ruby. Unlike many other scripting languages, it was designed from the ground up to be object-oriented. The impact of this object-oriented heritage will be obvious as you examine its functionality. Designed by Yukihiro Matsumoto, a.k.a. 'Matz', in Japan, it was first released on an unsuspecting world in 1995.
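Ruby's object-oriented heritage, noted above, is easy to see in practice. The following minimal sketch (the class and names are invented for illustration, not drawn from the column) shows that even literals are objects that respond to methods, and how concise class definitions are:

```ruby
# In Ruby, everything is an object: numeric and string literals
# respond to methods directly, with no wrapper types needed.
puts 42.even?        # true
puts "ruby".length   # 4

# A small, hypothetical class illustrating Ruby's compact OO syntax.
class Instrument
  attr_reader :name              # auto-generates a reader method for @name

  def initialize(name)
    @name = name
  end

  def describe
    "Instrument: #{@name}"       # string interpolation
  end
end

hplc = Instrument.new("HPLC-01")
puts hplc.describe               # prints "Instrument: HPLC-01"
```

Note how `attr_reader` generates the accessor method at class-definition time, a convenience that reflects the language's design goal of making object-oriented code terse and readable.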
Low-power self-organizing data networking Bill Weaver, Ph.D. I recently purchased a 2006 Toyota Corolla with the goal of reducing the frequency of trips to my local filling station. In time, I may begin to miss my close friends behind the counter, but one thing I was startled to miss right away was the throttle cable to the Corolla's VVT-i engine. Similar to the "fly-by-wire" transformation experienced by the U.S. Air Force, "drive-by-wire" technology has trickled down to entry-level automobiles.
Ancient Egyptian child introduced to the future of visualization Jennifer A. Miller, Managing Editor Allow me to redirect your attention from the norm, if only for a moment, as I take you on a voyage back in time over 2000 years to the land of ancient Egypt. The year is 1 A.D. A couple mourns the recent passing of their young daughter, Sherit, whose life was prematurely ended by a lethal early-childhood disease. And so, the ritual begins. In tribute to Osiris, the first known mummy, the child's body is to be physically preserved, so that her "ka," or "lifeforce," can find sustenance.
As with most things, we usually feel most comfortable dealing with the familiar on a daily basis. In this case, although there are probably thousands of software packages pertinent to data analysis worldwide, we tend to concentrate on the ones from the U.S. It is therefore a nice change to see what else is out there and review a package from our colleagues in Britain, Australia and New Zealand.
Simulating 3-D flames with unmatched accuracy Horst Simon Remember when "sticker shock" referred to the price of a new car — not the cost of filling the gas tank? Given rising oil prices, getting the most efficiency out of a gallon of gas has implications ranging from personal finances to national policy. Investing in combustion research at national laboratories and universities
Completing biocomputing simulations in a timely manner improves research quality and allows more focused and valid results Eric Pitcher, Ph.D. Tulane University is home to the Center for Computational Science (CCS), a unique facility designed to provide computational resources for research projects across many disciplines. The Center provides an infrastructure for investigators interested in computational science to exchange ideas, produce research and establish new collaborations. One of these collaborative efforts involves a team of researchers performing computational simulations of multi-scale models.
Remarkable growth in line with Moore's Law Larry Jones High performance computing (HPC) has been around for as long as the computer. While HPC has been a part of government and research institutes — such as NASA, the Department of Energy, and the Department of Defense
Complex computing operations could be greatly accelerated through massive parallel processing in a quantum computer. The smallest units of quantum information are known as quantum bits, which could be realized using atoms or molecules, if one can manipulate their position, quantum state…