Considered in isolation, big data is nothing more than job security for tech vendors and system managers. Only through application can the value of big data be realized. For example, scraping Web sites across the Internet will clearly generate a big data set.
There is nothing new about data. It has existed as long as measurement itself. Many in the scientific community are now asking, “What is Big Data? And what’s new about it?” What’s new is our relationship to data. Attention to size, whether on tera-, peta-, or exabyte boundaries, is indeed important, but that’s not what Big Data is about.
The door to the research group tea-room swung open and Rebecca marched in. She smiled at the team members already there and announced joyfully, “The review committee approved the business case — we can buy our supercomputer!” One person cheered, another grinned widely. Most simply grunted “good,” “about time,” “well done” or similar minimal shows of enthusiasm, not truly believing they had been successful yet.
You hear it before you see it — a roar like a factory in full production. But instead of cars or washing machines, this factory produces scientific knowledge. Stampede, the newest supercomputer at the Texas Advanced Computing Center (TACC) and one of the most advanced scientific research instruments in the world, fills aisle after aisle of a new 11,000-square-foot data center at The University of Texas at Austin.
This new version of the modeling software, based upon the Maple mathematical system, the open-standard Modelica modeling language, and a number of highly advanced design algorithms, integrates an impressive number of major upgrades. It was specifically designed as an environment in which to create complex multi-domain physical systems and simulate their behavior.
Anyone remember the old Shake’n’Bake commercials where the parents exclaim over the breaded chicken at dinner and the child chimes in, “And I helped!”? In 2012, record-setting HPC server revenue was the tasty chicken and the little helper was Big Data.
This year’s International Supercomputing Conference (ISC’13) will offer something for nearly everyone in the high performance computing (HPC) community. Now in its 28th year, ISC is recognized as the premier HPC conference and exhibition for Europe, but one that attracts all the major vendors and users from around the world.
Bill Dally, who became NVIDIA’s Chief Scientist and Senior Vice President of Research after leading the computer science department at Stanford University, will deliver the keynote address, “Future Challenges of Large-Scale Computing,” at the 2013 International Supercomputing Conference. The ISC’13 Communications Team recently caught up with him.
The tragic web of circumstances surrounding the “Blade Runner” Oscar Pistorius and the death of his girlfriend, Reeva Steenkamp, invokes distant comparisons to the “Trial of the Century” (TTOTC) involving O.J. Simpson. Of course, those circumstances are markedly different, but in the final analysis, both involved a high-profile athlete and accusations of murder.
What if ISO 17025 wasn’t a standard, but instead was a technology that automated laboratory best practices? What if following the regulations to the letter — often without even thinking about it — actually led to better overall business performance? In that case, it’s likely every business would aggressively deploy that technology.
Jack Dongarra is a popular speaker at the International Supercomputing Conference (ISC), held each year in Germany. In recognition of his significant contributions to the conference over the years, Dongarra has been named an ISC Fellow. As a lead-in to ISC’13 to be held June 16 to 20 in Leipzig, the ISC’13 communications team posed a few questions to Dongarra on his role in the TOP500 List and the current state of HPC.
Stephen S. Pawlowski is an Intel Senior Fellow, Chief Technology Officer for the Datacenter & Connected Systems Group (DCSG), and General Manager for the Architecture Group & DCSG Pathfinding. On June 18, Steve will be keynoting at the International Supercomputing Conference (ISC’13), and the topic of his talk is Moore’s Law. Here’s a Q&A session between Steve and ISC.
The ability of well-heeled individuals to carry a terabyte of information on the same keychains holding their car keys paints a dramatic picture of the inroads solid-state drive (SSD) storage is going to make in the consumer space in 2013. In contrast to double-digit SSD product growth, the latest quarterly reports from the two major hard disk manufacturers show that demand for spinning disk technology remains flat.
An area of accelerating market growth for electronic laboratory notebook (ELN) technology is in the analytical sciences, particularly late-stage biopharmaceutical development and quality. The large number of ELN users in the unregulated world of research has inspired organizations to pursue similar improvements in efficiency downstream on the R&D continuum.
It is always a pleasure to review a book on statistics (bias intended)! Especially one that is well-written, compact and well-constructed for its intended audience. Common Errors in Statistics (and How to Avoid Them), 4th Edition, is written for the non-statistician and can be readily assimilated by undergraduate students with a single statistics class under their belt.