Flash and fluff destroy project success Randy C. Hice I have a great deal of sympathy for those tasked with selecting vendors and products for complex laboratory automation architectures. The tools available to them are few, and the very nature of the industry revolves around...
Have you ever wondered how all of the information is gathered for the large Web databases? After all, in most cases an organization can't afford an army of people to go out, surf, and index the Web. Even if they could, it would take more than a small army to stay current with the existing Web, let alone keep up with its growth! In most cases, this data collecting, which is sometimes called aggregating, is done through the use of spiders...
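A spider of the kind described above can be sketched in a few lines of Python. This is a minimal illustration, not any particular search engine's crawler: the `site` dictionary below is a tiny in-memory stand-in for real HTTP fetches, with made-up URLs, so the breadth-first indexing logic can be seen on its own.

```python
from html.parser import HTMLParser
from collections import deque

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag seen while parsing a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(site, start):
    """Breadth-first crawl of `site` (a url -> html mapping standing in
    for real HTTP requests); returns pages in the order visited."""
    seen, order, queue = {start}, [], deque([start])
    while queue:
        url = queue.popleft()
        order.append(url)
        parser = LinkExtractor()
        parser.feed(site.get(url, ""))
        for link in parser.links:
            if link in site and link not in seen:
                seen.add(link)
                queue.append(link)
    return order

# A tiny in-memory "Web" for the spider to index:
site = {
    "/": '<a href="/a">A</a> <a href="/b">B</a>',
    "/a": '<a href="/b">B</a>',
    "/b": '<a href="/">home</a>',
}
print(crawl(site, "/"))  # each reachable page is visited exactly once
```

A real aggregator would fetch pages over HTTP, respect robots.txt, and store an index rather than a visit order, but the queue-and-seen-set core is the same.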
Synergistic partnership helps AH Marks improve productivity John Boother LIMS (Laboratory Information Management Systems) underpin the infrastructure of academic and commercial laboratories throughout the world. Their basic function is to handle data from sample registration right through to analysis and reporting stages. However, not all laboratories rely solely on a single LIMS
Last month, we took a look at some Windows utilities that can be very useful on a daily basis. This month, let's take a look at another set of utilities that, while you may not need to use them as frequently, are still essential tools for anyone moving data over the Internet or doing Web development. As you might have guessed, I'm talking about the frequently overlooked FTP clients.
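For readers who would rather script a transfer than install a graphical client, Python's standard-library `ftplib` covers the common case. The host and file paths below are placeholders, not real servers; substitute your own before running.

```python
from ftplib import FTP

def download(host, remote_path, local_path,
             user="anonymous", passwd="anonymous@"):
    """Fetch one file over FTP in binary mode.
    Host and paths here are hypothetical placeholders."""
    with FTP(host) as ftp:            # connects on construction
        ftp.login(user, passwd)       # anonymous login unless credentials given
        with open(local_path, "wb") as fh:
            ftp.retrbinary(f"RETR {remote_path}", fh.write)

# Usage (requires a reachable FTP server):
# download("ftp.example.com", "/pub/readme.txt", "readme.txt")
```

Binary mode (`retrbinary`) is the safe default when moving data files; ASCII mode silently rewrites line endings.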
It doesn't take Nostradamus to recognize that a lack of project sponsors is akin to a slow-moving albatross circling the stern of a ship taking on water Randy C. Hice There was bad karma in the air, and apparently much more. Hundreds of students and faculty were shoving their way through the doors as though a band of ravenous wolves were nipping at their heels...
Improved simulation accuracy and execution time vital in disaster mitigation Per Nyberg & Ilene Carpenter The Earth Sciences have traditionally played a key role in defining the requirements of high performance computing (HPC) platforms. As early HPC adopters, scientists studying the weather and climate pushed HPC systems to their limits
Is the tail wagging the dog? David Hessler A common complaint among laboratory software purchasers today is that commercially available product choices have serious limitations, requiring the users to adjust their workflows and practices, even database choices. We find that this is a source of ongoing frustration and dissatisfaction in our
The hunt for the right commodity interconnect Scott Studham A cluster of workstations may reach the vaunted classification of "supercomputer" when you can run tightly coupled
Frequency analysis with the Hilbert-Huang Transform Bill Weaver, Ph.D. We start to look at vectors and their importance to motion analysis around chapter three of the freshman physics text. By that time, the related concepts of position, displacement, velocity and acceleration have been introduced, and we start with...
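The Hilbert transform at the core of the Hilbert-Huang approach turns a real signal into a complex "analytic" signal whose phase yields instantaneous frequency. Below is a minimal NumPy sketch of that one building block (the full HHT first performs empirical mode decomposition, which is omitted here); it recovers the frequency of a 5 Hz test tone.

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the FFT (same idea as scipy.signal.hilbert):
    zero the negative frequencies and double the positive ones."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

fs = 1000.0                              # sample rate, Hz
t = np.arange(0, 1.0, 1.0 / fs)
x = np.cos(2 * np.pi * 5.0 * t)          # 5 Hz test tone

# Instantaneous frequency = derivative of the unwrapped phase / 2*pi
phase = np.unwrap(np.angle(analytic_signal(x)))
inst_freq = np.diff(phase) * fs / (2 * np.pi)
print(np.median(inst_freq))              # ~5 Hz, as expected
```

Unlike a Fourier spectrum, which averages over the whole record, the instantaneous frequency is a time series, which is what makes the approach attractive for nonstationary signals.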
The prospects for another Earth-Simulator-like event in 2005 are very good Horst D. Simon Since 1993, the TOP500 list of the world's fastest supercomputers has been released twice a year. The publication of the 23rd list a few weeks ago during the International Supercomputer Conference in Heidelberg, Germany, was a much-anticipated and closely watched event
Vance V. Kershner, President, LabWare, Inc. Being a LIMS vendor for over ten years offers a unique vantage point from which to observe what makes the industry tick and to understand what works and what doesn't. As much as things change, they stay the same from a principles point of view.
Simple method reduces noise in measurement platforms Tom Downey Pharmaceutical and biotechnology companies today are investing billions of dollars to collect, store, and analyze data from new genomic and proteomic technologies. It is expected that analysis of this data will result in the discovery of new biomarkers and improve the...
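The teaser does not say which method the article describes, but the simplest and most widely used noise-reduction technique in measurement platforms is signal averaging: the standard deviation of an average of n independent readings shrinks roughly as 1/sqrt(n). A quick stdlib-only simulation (invented numbers, for illustration only) makes the effect visible.

```python
import random
import statistics

random.seed(42)
TRUE_VALUE = 10.0

def measure():
    """One noisy reading: the true value plus Gaussian noise (sd = 1)."""
    return TRUE_VALUE + random.gauss(0.0, 1.0)

def averaged(n):
    """Average of n repeated readings; noise shrinks roughly as 1/sqrt(n)."""
    return statistics.fmean(measure() for _ in range(n))

single = [measure() for _ in range(1000)]       # raw readings
avg64 = [averaged(64) for _ in range(1000)]     # 64-fold averages

print(statistics.stdev(single))   # about 1 (the raw noise level)
print(statistics.stdev(avg64))    # about 1/8, i.e. 1/sqrt(64)
```

The price is throughput: an eightfold noise reduction costs 64 repeated measurements, which is why averaging is usually combined with other techniques.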
SigmaStat 3.0 and SigmaPlot are bundled for the analysis and graphic presentation of research data. While the two components of the bundle may be used as single packages or together, they are expressly designed to complement each other's work.
nQuery Advisor is software specifically for power and sample-size calculations. It assists the researcher in specifying the data variation and the desired or specified effect size needed for these calculations, in a user-friendly format. This release contains many analyses and tabular and formatting modes.
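The core of any such calculation can be seen in the textbook normal-approximation formula for comparing two means: n per group = 2((z_{1-a/2} + z_{1-b}) * sigma / delta)^2. The sketch below is that standard formula, not nQuery Advisor's implementation, using only the Python standard library.

```python
from math import ceil
from statistics import NormalDist

def n_per_group(delta, sigma, alpha=0.05, power=0.80):
    """Approximate per-group sample size for a two-sided, two-sample
    comparison of means (normal approximation to the t-test).
    delta = smallest difference worth detecting; sigma = common sd."""
    z = NormalDist().inv_cdf
    z_alpha = z(1 - alpha / 2)   # 1.96 for alpha = 0.05, two-sided
    z_beta = z(power)            # 0.84 for 80% power
    return ceil(2 * ((z_alpha + z_beta) * sigma / delta) ** 2)

# To detect a one-standard-deviation difference with 80% power
# at the 5% significance level:
print(n_per_group(delta=1.0, sigma=1.0))  # 16 per group
```

Dedicated packages refine this with the exact noncentral t distribution (which adds a subject or two per group for small n), but the formula captures the trade-off: halving the detectable effect quadruples the required sample size.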
Design-Expert software is a specialized design of experiments package that reveals vital factors in products or processes and the interactions between them. It maximizes desirability for dozens of responses simultaneously and displays the resulting sweet spot.
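The idea behind such a package can be illustrated with the simplest design of experiments: a two-level full factorial, where each factor is run at a coded low (-1) and high (+1) setting and a main effect is the mean response at +1 minus the mean at -1. This is a generic textbook sketch, not Design-Expert's algorithm; the response surface below is a made-up toy.

```python
from itertools import product

def full_factorial(k):
    """All 2**k runs of a two-level design, factors coded -1/+1."""
    return list(product([-1, 1], repeat=k))

def main_effects(design, responses):
    """Effect of each factor: mean response at +1 minus mean at -1."""
    k = len(design[0])
    effects = []
    for j in range(k):
        hi = [y for run, y in zip(design, responses) if run[j] == 1]
        lo = [y for run, y in zip(design, responses) if run[j] == -1]
        effects.append(sum(hi) / len(hi) - sum(lo) / len(lo))
    return effects

design = full_factorial(2)                         # 4 runs, factors A and B
responses = [3 + 2*a + 0.5*b for a, b in design]   # toy linear response
print(main_effects(design, responses))             # A matters far more than B
```

With coded +/-1 levels, each effect equals twice the regression coefficient (here 4.0 for A, 1.0 for B), which is how a DOE package ranks the "vital factors"; interaction effects are estimated the same way using the products of factor columns.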