From the start of the supercomputer era in the 1960s — and even earlier — an important subset of HPC jobs has involved analytics: attempts to uncover useful information and patterns in the data itself. Cryptography, one of the original scientific-technical computing applications, falls predominantly into this category.
At Cycle Computing, we’re seeing several large trends as they relate to Big Data and Analytics. We...
Scalable Productivity and the Ever-Increasing Tie between Data Analytics, Data Management and Computation | March 7, 2014 | by Barry Bolding, Cray
Cray continues to see an increasing trend in the HPC marketplace that we are calling “data-...
The TOP500 project was launched in 1993 to improve and renew the Mannheim supercomputer statistics, which had been in use for seven years. Our simple TOP500 approach does not define “supercomputer” as such, but we use a benchmark to rank systems and to decide on whether or not they qualify for the TOP500 list.
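To make the ranking idea concrete, here is a minimal sketch of ordering systems by a benchmark score and applying an entry cutoff. This is purely an illustration, not the TOP500 project's actual tooling or methodology; the system names, scores, the `rmax_tflops` field and the cutoff are all invented for the example.

```python
# Minimal sketch: rank hypothetical systems by a benchmark score.
# All names, scores and the qualification cutoff are illustrative only.
from operator import itemgetter

systems = [
    {"name": "SystemA", "rmax_tflops": 4210.0},
    {"name": "SystemB", "rmax_tflops": 9875.5},
    {"name": "SystemC", "rmax_tflops": 62.3},
]

CUTOFF_TFLOPS = 100.0  # assumed entry threshold for this toy list

# Keep only systems above the cutoff, then sort descending by score.
ranked = sorted(
    (s for s in systems if s["rmax_tflops"] >= CUTOFF_TFLOPS),
    key=itemgetter("rmax_tflops"),
    reverse=True,
)

for rank, s in enumerate(ranked, start=1):
    print(f"{rank:>3}  {s['name']:<10} {s['rmax_tflops']:>10.1f} TFlop/s")
```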
Is Big Data really the biggest challenge at the moment for translational science? Certainly there are issues with the complexity and size of omics data, which Big Data techniques can help address, but there are two more pressing challenges: enabling collaboration whilst facilitating information sharing, and the ability to better interpret multiple different types of omics data (multi-omics).
In the current issue of HPC Source, we explore some of the latest advances in “Power & Cooling” and share expert viewpoints on topics ranging from strategies for coping with escalating power and cooling requirements, to a look at the Tokyo Institute of Technology’s prototype TSUBAME-KFC system, to an examination of today’s liquid-cooling hardware. We also delve into some of the big unknowns in the future of power and cooling.
Today's enterprises face unique challenges. In the past, the requirement was simply to upgrade. Today, it's about building an integrated strategy that spans multiple technologies, both existing and new. For example, there is more diversity than ever before in database technology, server technology and data center infrastructure, to name a few areas. At the moment, none of these technologies is replacing the others; instead, they need to be integrated.
Next week, Scientific Computing will host a live panel discussion that looks at how a unique supercomputing system, created to serve the needs of a scientific community alliance in seven northern German states, has unified datacenter resources to address big data challenges. By streamlining the analysis process through automation, the HLRN alliance has improved performance and increased accuracy, resulting in greater efficiency.
All signs indicate a healthy continuing demand for mobile technology that can support ever-more-demanding eye-candy and apps on very-high-resolution display devices. According to independent high performance computing expert Rob Farber, mobile tech is where the money is right now in computer technology.
How can organizations embrace — instead of brace for — the rapidly intensifying collision of public and private clouds, HPC environments and Big Data? The current go-to solution for many organizations is to run these technology assets in siloed, specialized environments. This approach falls short, however, typically taxing one datacenter area while others remain underutilized, functioning as little more than expensive storage space.
Scientific Computing is excited to be celebrating its 30th year in 2014, and we have a terrific line-up of new things we will be introducing throughout the coming months. This includes a new global cross-platform app that is available across multiple devices and allows you to browse and read each issue anytime, anywhere. In our latest issue, we explore the theme of “Emerging Technologies” ...
Smartphones are an important part of our everyday life, a trend that holds true for laboratories as well. In countless industries, features and applications on mobile devices allow staff working outside of the lab to capture new types of data from remote locations more easily and accurately. But without proper technology, working remotely also poses unique challenges.
Size alone does not define big data — it is better characterized as a combination of volume, velocity, variety and value. Kevin Geraghty, head of analytics at 360i, captured the goal of big data analytics well when he said: “We are trying to listen to what the customer is telling us through their behavior.” The goal of big data analytics is to make the best business decisions possible.
The rapid increase of investment in biotherapeutics is changing the profile of the biopharmaceutical industry and, along with it, data management in the laboratory. With attention on longer patent life, high barriers to generic equivalents and personalized medicine, an increasing portion of R&D spending is being allocated to large molecule therapies, such as monoclonal antibodies.
On Monday, January 27, over 250 people gathered at the small evangelical church in Daisbach – a quiet village in southern Germany – to bid farewell to Hans Meuer, the founder of TOP500 and the ISC General Chair. Hans passed away on January 20 at the age of 77.
Ah, 30,000 feet and some old Dire Straits on the headphones, and waiting for my warm Heineken. Perfect. Though I enjoy lambasting companies that get it wrong, I’m also quick to stomp my feet and clap my hands when companies get it right. And I’ll do that, I promise, but allow me my fun first.
Josiah Stamp said: “The individual source of the statistics may easily be the weakest link.” Nowhere is this more true than in the new field of text mining, given the wide variety of textual information. By some estimates, 80 percent of available information exists as free-form text which, before the development of text mining, had to be read in its entirety for any information to be extracted from it.
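As a rough illustration of the kind of extraction text mining automates (not any specific product or method discussed here), the minimal sketch below counts informative terms across free-form documents so that nobody has to read every document end to end; the sample documents and stop-word list are invented for the example.

```python
# Minimal sketch of automated term extraction from free-form text.
# Documents and stop words are invented for illustration.
import re
from collections import Counter

documents = [
    "The assay showed elevated protein expression in treated samples.",
    "Protein expression levels varied with treatment dose and duration.",
]

STOP_WORDS = {"the", "in", "and", "with", "of", "a", "an"}

def term_frequencies(texts):
    counts = Counter()
    for text in texts:
        tokens = re.findall(r"[a-z]+", text.lower())
        counts.update(t for t in tokens if t not in STOP_WORDS)
    return counts

# Most common terms hint at what the collection is "telling us"
# without anyone reading every document end to end.
print(term_frequencies(documents).most_common(5))
```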
A computer program is running 24 hours a day at Carnegie Mellon University, searching the Web for images, doing its best to understand them on its own and, as it builds a growing visual database, gathering common sense on a massive scale. The Never Ending Image Learner, or NEIL, leverages recent advances in computer vision that enable computer programs to identify and label objects in images.