
1st Innovation Excellence Awards Presented

October 12, 2011 8:49 am | by Steve Conway

International Data Corporation (IDC) announced the first recipients of the new HPC Innovation Excellence Award at the ISC’11 International Supercomputing Conference.

The HPC Innovation Excellence Award recognizes noteworthy achievements by users of high performance computing (HPC) technologies.


ELN Authentication

October 10, 2011 11:05 am | by Michael H. Elliott

Navigating a Sea of Options

In an increasingly electronic R&D world, data must be stored securely for privacy, intellectual property protection, quality, regulatory and competitive reasons. As organizations move from controlled paper notebooks to an open and collaborative ELN work environment, record management risks must be addressed. Valuable intellectual property can be subject to theft, and databases are susceptible to data-altering malware and hackers. An organization must have consistent, audited and proven record management practices that are enforced across the entire spectrum of its R&D operations.
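One common way to make a record management system auditable and tamper-evident is a hash chain, in which each log entry incorporates the hash of the entry before it. The sketch below is illustrative only; the class and field names are assumptions, not drawn from any particular ELN product.

```python
import hashlib
import json
from datetime import datetime, timezone

def _entry_hash(prev_hash: str, payload: dict) -> str:
    """Hash the previous link together with the new record's payload."""
    blob = prev_hash + json.dumps(payload, sort_keys=True)
    return hashlib.sha256(blob.encode("utf-8")).hexdigest()

class AuditTrail:
    """Append-only log; altering any past record breaks every later hash."""

    def __init__(self):
        self.entries = []

    def append(self, user: str, action: str, record_id: str) -> None:
        # The first entry chains from a fixed all-zero "genesis" hash.
        prev = self.entries[-1]["hash"] if self.entries else "0" * 64
        payload = {
            "user": user,
            "action": action,
            "record_id": record_id,
            "time": datetime.now(timezone.utc).isoformat(),
        }
        self.entries.append({"payload": payload,
                             "hash": _entry_hash(prev, payload)})

    def verify(self) -> bool:
        """Recompute the chain; any edited payload makes this return False."""
        prev = "0" * 64
        for e in self.entries:
            if e["hash"] != _entry_hash(prev, e["payload"]):
                return False
            prev = e["hash"]
        return True
```

Because each hash depends on all earlier entries, a reviewer can detect after-the-fact edits without trusting the database itself.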


Automating Data Management while Facilitating Regulatory Compliance

October 10, 2011 9:59 am | by Paul Pearce, Ph.D., Colin Thurston

Nova Biologicals implements an integrated water, environmental and pharma LIMS/DMS

Nova Biologicals is a full-service, National Environmental Laboratory Accreditation Conference (NELAC)-accredited laboratory in Texas, providing testing and consulting services to the water, medical device, pharmaceutical, nutraceutical and food industries globally. Water testing makes up 53 percent of Nova’s total revenue, and the laboratory specializes in microbiological, chemical and toxicological testing of drinking water and wastewater samples. A team of dedicated scientists provides comprehensive diagnostic testing of specimens for the presence of infectious disease organisms and water testing under the Federal Safe Drinking Water Act.


Blueprint for Innovation

October 10, 2011 9:44 am | by Sandy Weinberg, Ph.D., Ronald Fuqua, Ph.D.

Encouraging computerized medical device invention

A patient swallows a computerized capsule, providing his physician with a series of images of the gastrointestinal tract. Another patient accesses the computer control on her wheelchair, which raises her to a standing position and follows a carefully designed exercise program to keep her legs from atrophying. A computerized “lab on a chip” provides toxicologists with a complete analysis series from a single sample. These and other computerized medical devices have two important characteristics in common: they are all innovations developed by entrepreneurs in a single country, and they represent the success stories of that country’s policies for supporting and encouraging innovation.


Origin Automation

October 6, 2011 10:38 am | by John R. Joyce, Ph.D.

An extremely flexible tool for acquiring, processing and displaying data

Origin 8.5.1 is a full-featured data analysis and graphing package that has been the subject of previous reviews in Scientific Computing. Here, we take a more in-depth look at the many automation features found in both Origin and OriginPro. These include several different ways of automating its internal processes, ways for it to control external processes, and ways for external processes to control it. In the following text, we examine these capabilities and some of the ways in which they can be used, and construct examples illustrating several of these approaches.


Actuaries and Epidemiologists: Same Data, Different Interpretations

October 5, 2011 10:14 am | by Sandy Weinberg, Ph.D. and Ronald Fuqua, Ph.D.

Interaction of two computer modeling fields provides critical disease mitigation tools

While neither may qualify as “the world’s oldest profession,” at least in risqué jokes, both actuaries and epidemiologists have long professional histories, with an interesting modern intersection. The Old Testament describes a variety of diseases in great detail: arguably, Moses was, in addition to his other skills, an effective epidemiologist. And, as for actuaries, didn’t Noah count off the animals in his ark two-by-two? It is in the modern analysis of health trends, however, that these two professions converge to provide a scientific basis for research and application.


Implementing Electronic Lab Notebooks Part 5

October 5, 2011 8:04 am | by Bennett Lass Ph.D., PMP

Systems Integration

Web Exclusive: This is the fifth article in a series on best practices in Electronic Lab Notebook (ELN) implementation. This article discusses the fourth core area: systems integration.


HPC Democratization

October 4, 2011 10:46 am | by Steve Conway, IDC Research VP, HPC

Be Careful What You Wish For

In 1995, the global market for high performance computing (HPC) servers, a.k.a. supercomputers, was worth about $2 billion. By 2010, that figure had nearly quintupled to $9.5 billion, thanks to the rise of HPC clusters based on commercial, off-the-shelf (COTS) technologies.


Competing with C++ and Java

October 4, 2011 10:13 am | by Rob Farber

Everyone’s a winner in the race for a common application language that can support both x86 and massively parallel hardware

Commercial and research projects must now field parallel applications to compete for customer and research dollars. This translates into pressure on software development efforts, which must control costs while supporting a range of rapidly evolving parallel hardware platforms. What is needed is a common programming language that developers can use to create parallel applications from a single source tree that runs on current and future parallel hardware.


What Constitutes a Good Forensics LIMS?

October 4, 2011 8:53 am | by John R. Joyce, Ph.D.

Proper selection requires careful review of needs and processes

While it is common for users in various laboratories and industries to feel that their processes are unique, in many ways they all have common needs. Similarly, in many respects, all laboratory information management systems (LIMS) are alike, or at least they should be. All must perform basic functions, such as tracking the users entering data, tracking the samples arriving at the laboratory and their processing through it, and generating analysis reports, while maintaining data integrity throughout the whole process.
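The core sample-tracking function described above can be sketched as a small state machine: each sample carries a status, only workflow-approved transitions are allowed, and every transition records who made it and when. This is a minimal illustration; the statuses and names are assumptions, not the schema of any real LIMS.

```python
from dataclasses import dataclass, field
from datetime import datetime

# Allowed status transitions for a sample moving through the laboratory.
WORKFLOW = {
    "received": {"in_analysis"},
    "in_analysis": {"reported"},
    "reported": set(),          # terminal state: no further transitions
}

@dataclass
class Sample:
    sample_id: str
    status: str = "received"
    history: list = field(default_factory=list)

    def transition(self, new_status: str, user: str) -> None:
        """Move the sample along the workflow, recording who did it and when."""
        if new_status not in WORKFLOW[self.status]:
            raise ValueError(
                f"illegal transition {self.status} -> {new_status}")
        self.history.append((datetime.now(), user, self.status, new_status))
        self.status = new_status
```

Refusing illegal transitions, rather than silently overwriting the status, is what preserves data integrity: the history list becomes a complete, ordered account of the sample’s path through the laboratory.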


Considerations for Software Expansions and Upgrades

September 30, 2011 11:21 am | by Peter J. Boogaard

Before you decide to rock the boat, several key decision-making steps can help to ensure a smooth and successful upgrade

Upgrading the IT infrastructure gives many organizations the ability to eliminate barriers to cross-functional collaboration between research, development, quality assurance and manufacturing. Standardizing workflows and operating procedures and applying best industry practices throughout operations also are key drivers. The advantage comes quickly when the implementation is fast and delivers strategic value. This article highlights key decision-making steps to consider when upgrading your software.


Maple 15: How do they do it?!

September 29, 2011 10:31 am | by John A. Wass, Ph.D.

For those new to this software, perhaps a little background is in order. Maple is mathematical software that is constantly being improved in the breadth of its calculation routines, the optimality of its algorithms, its speed of computation and its ease of use. The last is one of its most useful features, as a new user can quickly come up to speed by testing the menu items, going through the tutorials and reading the pertinent sections of the manuals.


Data Traffic is about to Explode: Don’t Get Caught in the Blast

September 28, 2011 11:13 am | by Dan Joe Barry, Napatech

Using what you have in a smarter way

Web Exclusive: If you like theory, then you’ll be interested to know that many are predicting that data center traffic is set to increase sharply. As cloud computing centralizes more computing resources, more devices, such as mobile phones, tablets and TVs, are being used to exchange data.


Seven Tips for Staying on the Right Side of the Ethical Line in Tough Times and Beyond

September 23, 2011 9:34 am | by Christopher Bauer, Ph.D.

Avoiding the consequences of cutting corners

Web Exclusive: Times are tough in business right now, certainly including laboratory informatics, and any sane business person is trying to cut costs in any way possible. Scaling back on your attention to ethics, however, can have catastrophic consequences. Between possible fines, legal fees and reputational damage to you and your company, you could lose anything from thousands to millions of dollars, as well as your career or business. No matter how tough the times are, that kind of risk is simply not worth taking. Unfortunately, though, tough times can easily make it seem worth cutting ethical corners if it looks like there might be some financial gain from it.


How Data Tiering Can Lower IT Storage Costs

September 20, 2011 10:14 am | by Will McGrath

Solving the problems of “big” data growth

Web Exclusive: The explosion of data growth created by next-gen instruments has caused tremendous challenges in handling and storing those files. While structured and semi-structured data, like e-mail and databases, continue to grow, it is really unstructured “big” data growth that is causing the biggest problems. This is true in verticals like oil and gas, with upstream seismic and interpretation applications, and in life sciences, with a number of newer instruments being introduced for electron microscopy, high content screening / high throughput screening and flow cytometry, and especially next-generation sequencing (NGS).
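The tiering idea behind the article can be sketched as a simple policy: files accessed recently stay on fast storage, while colder files migrate to cheaper tiers. The tier names, age thresholds and per-GB costs below are assumptions chosen for the sketch, not vendor figures.

```python
import time

# (tier name, max days since last access, assumed $/GB/month).
# Ordered from most recently accessed (fastest, most expensive)
# to coldest (slowest, cheapest).
TIERS = [
    ("fast", 30, 0.30),
    ("nearline", 180, 0.10),
    ("archive", float("inf"), 0.02),
]

def assign_tier(last_access_epoch, now=None):
    """Return the name of the first tier whose age threshold the file meets."""
    if now is None:
        now = time.time()
    age_days = (now - last_access_epoch) / 86400.0
    for name, max_age_days, _cost_per_gb in TIERS:
        if age_days <= max_age_days:
            return name
    return TIERS[-1][0]  # unreachable given the infinite final threshold
```

Run periodically over a file listing, a rule like this lets unstructured data age off expensive storage automatically, which is where the cost savings described above come from.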


