Articles

Of SuperHEROES and Superpowers in Supercomputing

November 1, 2011 1:18 pm | by CE Stevens | Comments

Last June, in the midst of a nation reeling from the most devastating natural disasters in its nearly 3,000-year history, the high performance computing (HPC) industry quaked with its own surprise at the debut of the newest leader on the Top 500 list of the world's fastest supercomputers — the “K computer” at the RIKEN Advanced Institute for Computational Science (AICS) in Kobe, Japan.


The Art of Science — Visualize the Possibilities

November 1, 2011 1:15 pm | by CE Stevens | Comments

Kelly Gaither is a major driving force in HPC visualization, including the development of “superdisplays,” large tiled visualization walls for working with large data sets on parallel systems. As Director of Visualization at the Texas Advanced Computing Center (TACC), she currently hosts one of the world's largest scientific visualization (“SciVis”) systems.


Implementing Electronic Lab Notebooks Part 6

October 24, 2011 9:21 am | by Bennett Lass Ph.D., PMP | Comments

Web Exclusive: This is the sixth and final article in a series on best practices in Electronic Lab Notebook (ELN) implementation. This article discusses the fifth and last core area: Research Management.


New Algorithm May Help Data Centers Better Control Power Costs

October 17, 2011 10:25 am | by Mike Martin | Comments

Tackling green data center development challenges: On the ground or in the cloud, energy consumption can pose costly dilemmas to data center operators looking to maximize revenue and minimize expense. To keep power costs down and paying clients happy, a three-person international research team has developed — and tested — a straightforward yet novel algorithm that optimizes server operations by balancing power with performance.
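
The teaser does not spell out the researchers' algorithm, so the sketch below is only a hypothetical illustration of the underlying tradeoff it targets: keeping just enough servers powered on to hold utilization under a latency-friendly cap while minimizing electricity draw. All class names, figures and thresholds here are assumptions, not the published method.

```java
/**
 * Hypothetical illustration only (not the algorithm from the article):
 * right-size a server pool so a target utilization is met with as few
 * powered-on machines as possible, then estimate the resulting power draw.
 */
public class PowerPerformanceSketch {

    /** Smallest number of active servers that keeps utilization under the cap. */
    static int serversNeeded(double requestsPerSec, double perServerCapacity, double maxUtilization) {
        return (int) Math.ceil(requestsPerSec / (perServerCapacity * maxUtilization));
    }

    /** Estimated draw with unneeded servers switched off. */
    static double powerKw(int activeServers, double kwPerActiveServer) {
        return activeServers * kwPerActiveServer;
    }

    public static void main(String[] args) {
        double load = 12_000;      // offered load, requests per second (assumed)
        double capacity = 500;     // requests per second one server can sustain (assumed)
        double utilCap = 0.7;      // keep servers below 70% busy for latency headroom (assumed)
        double kwPerServer = 0.35; // kW drawn by an active server (assumed)

        int active = serversNeeded(load, capacity, utilCap);
        System.out.printf("Active servers: %d, estimated draw: %.1f kW%n",
                active, powerKw(active, kwPerServer));
    }
}
```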


Could Power & Cooling Costs Spur a Scientific Migration?

October 17, 2011 9:58 am | by Steve Conway | Comments

Recent research shows HPC sites plan expansion despite growing concerns: A decade ago, power and cooling didn’t make it onto the top 10 list of issues HPC data centers said they were facing. Today, power and cooling consistently ranks among data center managers’ top two or three challenges. What’s changed?


Cool NERSC Experiment Really Pays Off in Energy Savings

October 17, 2011 9:15 am | by Brent Draney | Comments

Custom cooling distribution unit built on commodity hardware delivers energy and space savings: The U.S. Department of Energy’s National Energy Research Scientific Computing Center (NERSC) is one of the largest facilities in the world devoted to providing computing resources and expertise for basic science research to nearly 4,000 researchers from around the globe. To facilitate this research, the center houses a range of HPC systems — including a new 1,120-node system that serves as a combined high performance computing cluster and scientific cloud computing testbed. The system was installed last year to replace two existing clusters and to support an American Recovery and Reinvestment Act project, called Magellan, that explores whether a cloud computing model could meet the needs of scientists.


Exciting Times in Africa

October 15, 2011 1:13 pm | by CE Stevens | Comments

Happy Sithole is pioneering research and technology frontiers on behalf of South Africa and the continent as a whole. Happy has been integral to numerous African “firsts,” beginning with the inauguration of South Africa's Centre for High Performance Computing (CHPC) in 2007, which featured the first Top 500 system listing for Africa.


Power a Major Hurdle on Road to Exascale

October 14, 2011 10:47 am | by John Kirkley | Comments

Unless some groundbreaking solutions are forthcoming, exascale computing may remain more fancy than fact: When the Defense Advanced Research Projects Agency (DARPA) issued the report “Exascale Computing Study: Technology Challenges in Achieving Exascale Systems”1 on September 28, 2008, it sent shock waves through the high performance computing (HPC) community. The report flatly stated that current technology trends were “insufficient” to achieve exascale-level systems in the next five to 10 years. The biggest stumbling block? Power.
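
A rough, back-of-the-envelope estimate shows why power dominates this discussion. The figures below are approximate 2011 values assumed for illustration, not numbers from the DARPA report: the most efficient systems of the day delivered on the order of 0.8 GFLOPS per watt, so an exaflop machine built on that technology would draw roughly a gigawatt.

```latex
% Illustrative estimate, assuming ~0.8 GFLOPS/W (approximate 2011 efficiency)
P_{\text{exa}} \approx \frac{10^{18}\,\text{FLOPS}}{0.8 \times 10^{9}\,\text{FLOPS/W}}
             \approx 1.2 \times 10^{9}\,\text{W} \approx 1.2\ \text{GW}
```

Holding an exascale system to the roughly 20 MW envelope often cited in exascale planning would instead require about 50 GFLOPS per watt, roughly a 60-fold gain in energy efficiency.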


Two Megawatts of Computing Power in a Ford Fiesta?

October 14, 2011 10:04 am | by Phil E. Tuma | Comments

Using immersion cooling to reach the next level of power density and efficiency: Progress in leadership-class computing is being hindered by the limitations of conventional air cooling technology. Multicore chip architectures, faster memory and increases in parallelism have meant an increase in the amount of computational power that must be devoted to communication. While evolving technologies such as 3-D packaging, low-loss materials and improved Z-axis and optical interconnect will play an important role in increasing off-chip and inter-node bandwidth, decreasing signal path length through increased packaging density remains a tried-and-true strategy.


What You Should Know about State-of-the-art Data Storage Regulations

October 13, 2011 8:47 am | by Sandy Weinberg, Ph.D. and Ronald Fuqua, Ph.D. | Comments

A composite look at four laws, decisions and guidelines related to pharmaceutical data: In the U.S. pharmaceutical industry, the collection, storage, mining and analysis of data are subject to a number of disjointed, uncoordinated and occasionally contradictory regulatory restrictions. Pharmaceutical data falls into two general categories, each with differing regulatory oversight and guidelines. In the developmental process, the clinical data that describes tests of product safety and efficacy falls under the purview of the U.S. Food and Drug Administration (FDA).


Perfecting High Performance Storage

October 13, 2011 8:02 am | by Mike May | Comments

Today’s technology will improve tomorrow’s computer memory: Although advances in floating point operations per second (FLOPS) often take center stage in high performance computing, faster computation cannot keep forging ahead without equally improved data-storage capabilities. The question is: What technology will spawn tomorrow’s best memory?


For Storing Web 3.0, HBase has the Edge

October 12, 2011 9:59 am | by Mike Martin | Comments

A storage system modeled after Google’s BigTable has the edge in data management for next-generation Internet and cloud computing users, claim researchers at the University of Texas – Pan American (UTPA) in Edinburg. In tests designed to find the best storage technologies for Web 3.0 — also known as the Semantic Web — Apache’s Hadoop database, HBase, outperformed MySQL Cluster.
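
For readers unfamiliar with the BigTable model, the sketch below shows one minimal way Semantic Web style triples could be written to HBase from its Java client, with the subject as the row key and one column per predicate. The table name, column family and schema are assumptions made for illustration (not the schema used in the UTPA tests), the code uses the current HBase client API rather than the 2011-era one, and it assumes a “triples” table with a “p” column family already exists.

```java
// Minimal sketch only: persist an RDF-style triple (subject, predicate, object)
// to HBase. Requires the Apache HBase client library on the classpath and an
// existing table named "triples" with a column family "p".
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

public class TripleStoreSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        try (Connection conn = ConnectionFactory.createConnection(conf);
             Table triples = conn.getTable(TableName.valueOf("triples"))) {

            // Row key = subject URI; each predicate becomes a column qualifier
            // in family "p", with the object stored as the cell value.
            Put put = new Put(Bytes.toBytes("http://example.org/alice"));
            put.addColumn(Bytes.toBytes("p"),
                          Bytes.toBytes("foaf:knows"),
                          Bytes.toBytes("http://example.org/bob"));
            triples.put(put);
        }
    }
}
```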


ELN Authentication

October 10, 2011 11:05 am | by Michael H. Elliott | Comments

Navigating a Sea of Options: In an increasingly electronic R&D world, data must be stored securely for privacy, intellectual property protection, quality, regulatory and competitive reasons. As organizations move from controlled paper notebooks to an open and collaborative ELN work environment, there are record management risks that must be addressed. Valuable intellectual property can be subject to theft, and databases are susceptible to data-altering malware and hackers. An organization must have consistent, audited and proven record management practices that are enforced across the entire spectrum of its R&D operations.


Automating Data Management while Facilitating Regulatory Compliance

October 10, 2011 9:59 am | by Paul Pearce, Ph.D., Colin Thurston | Comments

Nova Biologicals implements an integrated water, environmental and pharma LIMS/DMS: Nova Biologicals is a full-service, National Environmental Laboratory Accreditation Conference (NELAC)-accredited laboratory in Texas, providing testing and consulting services to the water, medical device, pharmaceutical, nutraceutical and food industries globally. Water testing makes up 53 percent of Nova’s total revenue, and the laboratory specializes in microbiological, chemical and toxicological testing of drinking water and wastewater samples. A team of dedicated scientists provides comprehensive diagnostic testing of specimens for the presence of infectious disease organisms, as well as water testing under the Federal Safe Drinking Water Act.


Blueprint for Innovation

October 10, 2011 9:44 am | by Sandy Weinberg, Ph.D., Ronald Fuqua, Ph.D. | Comments

Encouraging computerized medical device invention: A patient swallows a computerized capsule, providing his physician with a series of images of the gastrointestinal tract. Another patient accesses the computer control on her wheelchair, which raises her to a standing position and follows a carefully designed exercise program to keep her legs from atrophying. A computerized “lab on a chip” provides toxicologists with a complete analysis series from a single sample. These and other computerized medical devices have two important characteristics in common: they are all innovations developed by entrepreneurs in a single country, and they represent the success stories of that country’s policies for supporting and encouraging innovation.

