Even as CPU power and memory bandwidth march forward, one major bottleneck has hampered overall supercomputing performance over the past decade: I/O interconnectivity. The vision behind Intel’s new Omni Scale Fabric is to deliver a platform for the next generation of HPC systems.
The recent PRACE Days conference in Barcelona provided powerful reminders that massive data...
My last blog post described federal government initiatives that have driven data management...
Solving some of the biggest challenges in society, industry and sciences requires dramatic...
Using Powerful GPU-Based Monte Carlo Simulation Engine to Model Larger Systems, Reduce Data Errors, Improve System Prototyping
July 22, 2014, by Jeffrey Potoff and Loren Schwiebert
Recently, our research work got a shot in the arm when Wayne State University received a complete high-performance compute cluster donated by Silicon Mechanics as part of its 3rd Annual Research Cluster Grant competition. The new HPC cluster gives us state-of-the-art hardware that will enhance the development of our ongoing project: a novel GPU-Optimized Monte Carlo simulation engine for molecular systems.
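As a rough illustration of what a Monte Carlo engine for molecular systems computes at its core, here is a minimal sketch in C of a single Metropolis trial move. The Lennard-Jones energy model, the reduced units and every name below are assumptions chosen for illustration; this is not code from our engine.

/* Minimal sketch of a Metropolis Monte Carlo move for a molecular
 * system. Illustrative only: the Lennard-Jones energy model and all
 * names are assumptions, not the authors' engine. */
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

#define N_ATOMS 256
#define KT 1.0            /* temperature in reduced units */
#define MAX_DISP 0.1      /* maximum trial displacement */

typedef struct { double x, y, z; } Vec3;

static double frand(void) { return rand() / (RAND_MAX + 1.0); }

/* Lennard-Jones pair energy in reduced units: 4(r^-12 - r^-6). */
static double pair_energy(Vec3 a, Vec3 b) {
    double dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    double r2 = dx*dx + dy*dy + dz*dz;
    double inv6 = 1.0 / (r2 * r2 * r2);
    return 4.0 * (inv6 * inv6 - inv6);
}

/* Energy of atom i against all other atoms. */
static double atom_energy(const Vec3 *pos, int i) {
    double e = 0.0;
    for (int j = 0; j < N_ATOMS; j++)
        if (j != i) e += pair_energy(pos[i], pos[j]);
    return e;
}

/* One Metropolis trial move: displace a random atom, then accept or
 * reject based on the energy change. Returns 1 if accepted. */
static int mc_move(Vec3 *pos) {
    int i = rand() % N_ATOMS;
    Vec3 old = pos[i];
    double e_old = atom_energy(pos, i);

    pos[i].x += (2.0 * frand() - 1.0) * MAX_DISP;
    pos[i].y += (2.0 * frand() - 1.0) * MAX_DISP;
    pos[i].z += (2.0 * frand() - 1.0) * MAX_DISP;

    double de = atom_energy(pos, i) - e_old;
    if (de <= 0.0 || frand() < exp(-de / KT))
        return 1;               /* accept */
    pos[i] = old;               /* reject: restore old position */
    return 0;
}

int main(void) {
    Vec3 pos[N_ATOMS];
    /* start atoms on a simple cubic lattice */
    for (int i = 0; i < N_ATOMS; i++)
        pos[i] = (Vec3){ i % 8, (i / 8) % 8, i / 64 };
    int accepted = 0;
    for (int step = 0; step < 10000; step++)
        accepted += mc_move(pos);
    printf("acceptance ratio: %.3f\n", accepted / 10000.0);
    return 0;
}

The inner loop over all atom pairs is where nearly all of the time goes, and it is exactly the kind of data-parallel work that maps well onto a GPU.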
How using CPU/GPU parallel computing is the next logical step
My work in computational mathematics is focused on developing new, paradigm-shifting ideas in numerical methods for solving mathematical models in various fields. These include the Schrödinger equation in quantum mechanics, the elasticity model in mechanical engineering, the Navier-Stokes equation in fluid mechanics, Maxwell’s equations in electromagnetism...
For the past 21 years, TOP500.org has ranked supercomputers by their performance on the LINPACK Benchmark. Released twice a year, the list is eagerly anticipated by the industry. As with any such ranking, the top of the list often garners the most attention. However, emphasizing only the top would limit one’s understanding of the different supercomputers in the TOP500...
An energy-efficient supercomputer cooled with warm water. How cool is that? Enlightenment has long been the ultimate pursuit of artists, philosophers, scientists, theologians and other sentient minds. Whether delivering proof to support their theses or investigating a perplexing problem before them, they have poured a vast amount of energy into the task. Energy has now become the problem.
Internet regulation in the United States is potentially facing a major change. FCC Internet Neutrality rules, also referred to as Net Neutrality rules, currently apply, but thanks to pressure from Internet Service Providers (ISPs), legislators and recent court rulings, that might change. You have undoubtedly heard the term Net Neutrality before, but may be at a loss regarding what it means or what its implications are.
Since June of 1993, the Top500 List has been presenting information on the world’s 500 most powerful computer systems. The statistics about these systems have proven to be of substantial interest to computer manufacturers, users and funding authorities. While interest in the list is focused on the computers, less attention is paid to the countries hosting them. Let’s take a look at the Top500 List countries. Who are they?
In the late 90s, I was teaching students parallel programming in C using MPI. The most important lesson I wanted them to remember was that communication matters far more than computation. The form of the benchmark couldn't be more common: a set of convolutional filters applied to an image, one filter after the other in a pipelined fashion.
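To make that pipeline concrete, here is a minimal sketch in C with MPI, assuming one rank per filter stage and the image streamed between ranks in fixed-size tiles. The tile size, the stand-in filter and all names are illustrative assumptions, not the original benchmark.

/* Minimal sketch of a pipelined image filter in MPI: one rank per
 * filter stage, image tiles streamed rank-to-rank. Illustrative
 * only; the filter and tile size are placeholder assumptions. */
#include <mpi.h>
#include <string.h>

#define TILE 4096          /* pixels per tile */
#define N_TILES 64

static void apply_filter(float *tile, int n, int stage) {
    for (int i = 0; i < n; i++)
        tile[i] = tile[i] * 0.5f + stage;   /* stand-in for a convolution */
}

int main(int argc, char **argv) {
    int rank, size;
    float tile[TILE];

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    for (int t = 0; t < N_TILES; t++) {
        if (rank == 0)
            memset(tile, 0, sizeof tile);   /* stand-in for reading input */
        else
            MPI_Recv(tile, TILE, MPI_FLOAT, rank - 1, t,
                     MPI_COMM_WORLD, MPI_STATUS_IGNORE);

        apply_filter(tile, TILE, rank);     /* this rank's filter stage */

        if (rank < size - 1)
            MPI_Send(tile, TILE, MPI_FLOAT, rank + 1, t, MPI_COMM_WORLD);
        /* the last rank would write the filtered result */
    }
    MPI_Finalize();
    return 0;
}

While one tile is in flight between ranks, the upstream rank is already filtering the next one, so the sustained rate is set by the slowest stage and by the cost of moving tiles between ranks, which is exactly why communication, not arithmetic, dominates.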
Nelson Mandela said: “We must use time wisely and forever realize that the time is always ripe to do right.” Time series data have a temporal order that makes their analysis distinctly different from other data analysis. The goals of time series analysis fall into two categories: characterization and prediction.
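As a small example of characterization, here is a sketch in C of the sample autocorrelation at lag k, one of the most basic ways to quantify the temporal order that sets time series apart. The function is an illustrative textbook formula, not code from the article.

/* Sample autocorrelation of series x at lag k: a basic time series
 * characterization tool. Illustrative textbook sketch. */
#include <stddef.h>

double autocorr(const double *x, size_t n, size_t k) {
    double mean = 0.0, num = 0.0, den = 0.0;
    for (size_t t = 0; t < n; t++) mean += x[t];
    mean /= (double)n;
    for (size_t t = 0; t + k < n; t++)          /* lagged cross-products */
        num += (x[t] - mean) * (x[t + k] - mean);
    for (size_t t = 0; t < n; t++)              /* total variance */
        den += (x[t] - mean) * (x[t] - mean);
    return den > 0.0 ? num / den : 0.0;
}

A value near 1 at lag k says the series strongly resembles a shifted copy of itself, structure that unordered data simply cannot have.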
LEGO has announced that a female minifigure set, featuring three scientists along with their lab gear, will be released as the next LEGO Ideas set. This “Research Institute” model is an official set of all-female scientist figures — a paleontologist, an astronomer and a chemist — made with regular LEGO minifigures.
In our June 2014 issue, expert contributors share their experience on topics ranging from collaboration in the cloud to meeting new regulatory requirements to time series analysis. The cover story “Recognizing ROI and Innovative Application of High Performance Computing,” by Chirag Dekate, Research Manager, HPC at IDC, provides an update on how IDC’s Innovation Excellence Award Program continues to showcase benefits of investment in HPC.
In conjunction with ISC’14, we will hold a one-day HPC Advisory Council European Conference Workshop on June 22, 2014. The workshop will focus on HPC productivity and on advanced HPC topics and futures, bringing together system managers, researchers, developers, computational scientists and industry affiliates to discuss recent developments and future advancements in High-Performance Computing. Our keynote session will feature the SKA Project.
I’m pleased to announce that Scientific Computing has launched an HPC User Forum resource site in cooperation with IDC. The HPC User Forum page is a one-stop destination offering comprehensive information on the global HPC industry, collected in one easy-to-navigate place.
Ever since we started the Google self-driving car project, we’ve been working toward the goal of vehicles that can shoulder the entire burden of driving. Just imagine: You can take a trip downtown at lunchtime without a 20-minute buffer to find parking. Seniors can keep their freedom even if they can’t keep their car keys. And drunk and distracted driving? History.
HPC systems have evolved significantly over the last two decades. While HPC was once the domain of purpose-built supercomputers, clustered systems rule the roost today. Horizontal scaling has proven to be the most cost-efficient way to increase capacity. What supercomputers all have in common today is their reliance on distributed computing.
On February 26, 2003, the National Institutes of Health released the “Final NIH Statement on Sharing Research Data.” As you’ll be reminded when you visit that link, 2003 was eons ago in “Internet time.” Yet the vision NIH had for the expanded sharing of research data couldn’t have been more prescient. As the Open Government Data site notes, government data is a tremendous resource that can have a positive impact on ...