
The New Frontier of Biologics ELN

September 1, 2013 5:36 pm | by Michael H. Elliott

The rapid increase of investment in biotherapeutics is changing the profile of the biopharmaceutical industry and, along with it, data management in the laboratory. With attention on longer patent life, high barriers to generic equivalents and personalized medicine, an increasing portion of R&D spending is being allocated to large molecule therapies …


Big Data Requires Scalable Storage Bandwidth

August 15, 2013 9:00 am | by Rob Farber

The adage that a supercomputer is a complicated device that turns a compute-bound problem into an IO-bound problem is becoming ever more apparent in the age of big data. The trick to avoiding the painful truth in this adage is to ensure that the application workload is dominated by streaming IO operations.


Global Filesystem Delivers High Bandwidth, High Volume, Low Response Time

August 15, 2013 8:55 am | by Jon Bashor, LBNL

Discovery of the last neutrino mixing angle — one of Science magazine’s top 10 breakthroughs of the year 2012 — was announced in March 2012, just a few months after the Daya Bay Neutrino Experiment’s first detectors went online in southeast China. Collaborating scientists from China, the United States, the Czech Republic and Russia were thrilled that their experiment was producing more data than expected …


At NERSC, Scalable Storage for Big Data Leads to Big Science Breakthroughs

August 15, 2013 8:49 am | by Jon Bashor

The U.S. Department of Energy’s National Energy Research Scientific Computing Center has a straightforward approach to data: When any of the center’s 4,500 users need access to their data, NERSC needs to be able to deliver. It’s an approach that has worked well for 39 years and helps NERSC’s users publish more than 1,500 scientific papers annually.


HPC Architectures Begin Long-Term Shift Away from Compute Centrism

August 15, 2013 8:43 am | by Steve Conway, IDC

The HPC market is entering a kind of perfect storm. For years, HPC architectures have tilted farther and farther away from optimal balance between processor speed, memory access and I/O speed. As successive generations of HPC systems have upped peak processor performance without corresponding advances in per-core memory capacity and speed, the systems have become increasingly compute centric …


Scalable Storage Solutions for Applied Big Data

June 20, 2013 4:35 pm | by Rob Farber

Considered in isolation, big data is nothing more than job security for tech vendors and system managers. Only through application can the value of big data be realized. For example, scraping Web sites across the Internet will clearly generate a big data set.


Big Data: It’s not Just Big

June 20, 2013 4:30 pm | by David Skinner

There is nothing new about data. It has existed as long as measurement itself. Many in the scientific community are thus asking, “What is Big Data? And what’s new about it?” What’s new is our relationship to data. Attention to size, whether on tera-, peta-, or exabyte boundaries, is indeed important, but that’s not what Big Data is about.


Supercomputing: The Reality and Vision

June 20, 2013 4:27 pm | by Andrew Jones

The door to the research group tea-room swung open and Rebecca marched in. She smiled at the team members already there and announced joyfully, “The review committee approved the business case — we can buy our supercomputer!” One person cheered, another grinned widely. Most simply grunted “good,” “about time,” “well done” or similar minimal shows of enthusiasm — not truly believing they had been successful yet.


Texas Unleashes Stampede for Science

June 20, 2013 4:21 pm | by Aaron Dubrow

You hear it before you see it — a roar like a factory in full production. But instead of cars or washing machines, this factory produces scientific knowledge. Stampede, the newest supercomputer at the Texas Advanced Computing Center (TACC) and one of the most advanced scientific research instruments in the world, fills aisle after aisle of a new 11,000-square-foot data center at The University of Texas at Austin.


MapleSim 6: Advanced Systems Level Modeling

June 20, 2013 3:59 pm | by John A. Wass, Ph.D.

This new version of the modeling software, based upon the Maple mathematical system, the open-standard Modelica modeling language and a number of highly advanced design algorithms, integrates a pleasing number of major upgrades. It was specifically designed as an environment in which to create complex multi-domain physical systems and simulate their behavior …


Big Data Helps Lift HPC Market to Record Revenues

June 20, 2013 3:47 pm | by Steve Conway

Anyone remember the old Shake’n’Bake commercials where the parents exclaim over the breaded chicken at dinner and the child chimes in, “And I helped!”? In 2012, record-setting HPC server revenue was the tasty chicken and the little helper was Big Data.


ISC Offers Mid-Year Supercomputing Roundup for Users, Vendors

June 4, 2013 9:43 am | by ISC

This year’s International Supercomputing Conference (ISC’13) will offer something for nearly everyone in the high performance computing (HPC) community. Now in its 28th year, ISC is recognized as the premier HPC conference and exhibition for Europe, but one that attracts all the major vendors and users from around the world.


Bill Dally on Future Challenges of Large-Scale Computing

May 6, 2013 2:01 pm

Bill Dally, who became NVIDIA’s Chief Scientist and Senior Vice President of Research after leading the computer science department at Stanford University, will discuss “Future Challenges of Large-Scale Computing” as the conference keynote address at the 2013 International Supercomputing Conference. The ISC’13 Communications Team recently caught up with him.


Forensics Fiasco

April 15, 2013 3:07 pm | by Randy C. Hice

The tragic web of circumstances surrounding the “Blade Runner” Oscar Pistorius and the death of his girlfriend, Reeva Steenkamp, invokes distant comparisons to the “Trial of the Century” (TTOTC) involving O.J. Simpson. Of course, those circumstances are markedly different, but in the final analysis, both involved a high-profile athlete and accusations of murder.


ISO 17025: A Challenge and a Best Practice for Laboratories

April 15, 2013 1:49 pm | by Colin Thurston, Thermo Fisher Scientific

What if ISO 17025 wasn’t a standard, but instead was a technology that automated laboratory best practices? What if following the regulations to the letter — often without even thinking about it — actually led to better overall business performance? In that case, it’s likely every business would aggressively deploy that technology.

