All signs indicate a healthy continuing demand for mobile technology that can support ever-more-demanding eye candy and apps on very-high-resolution display devices. According to independent high performance computing expert Rob Farber, mobile tech is where the money is in computer technology right now.
How can organizations embrace — instead of brace for — the rapidly intensifying collision of public and private clouds, HPC environments and Big Data? The current go-to solution for many organizations is to run these technology assets in siloed, specialized environments. This approach falls short, however, typically taxing one datacenter area while others remain underutilized, functioning as little more than expensive storage space.
At Cycle Computing, we’re seeing several large trends as they relate to Big Data and analytics. We started talking about this concept of Big Compute back in Oct. 2012. In many ways, it’s the point where HPC meets the challenges of Big Data. As our technical capabilities to collect and store data continue to expand, the problem of how we access and use that data is only growing.
Scalable Productivity and the Ever-Increasing Tie between Data Analytics, Data Management and Computation — March 7, 2014, by Barry Bolding
Cray continues to see an increasing trend in the HPC marketplace that we are calling “data-intensive” supercomputing. The dramatic growth in scientific, commercial and social data is resulting in an expanded customer base that is asking for much more complex analysis and simulation.
In 2013, the term big data continued to dominate as a source of technology challenges, experimentation and innovation. It’s no surprise, then, that many business and IT executives are suffering from big data exhaustion, leading Gartner to deem 2013 the year the technology entered the “Trough of Disillusionment.”
From the start of the supercomputer era in the 1960s — and even earlier — an important subset of HPC jobs has involved analytics: attempts to uncover useful information and patterns in the data itself. Cryptography, one of the original scientific-technical computing applications, falls predominantly into this category.
Scientific Computing is excited to be celebrating its 30th year in 2014, and we have a terrific line-up of new things we will be introducing throughout the coming months. This includes a new global cross-platform app that is available across multiple devices and allows you to browse and read each issue anytime, anywhere. In our latest issue, we explore the theme of “Emerging Technologies” ...
Smartphones are an important part of our everyday life, a trend that holds true for laboratories as well. In countless industries, features and applications on mobile devices allow staff working outside of the lab to easily and more accurately capture new types of data from remote locations. But without proper technology, working remotely also poses unique challenges.
Size alone does not define big data — it is best defined as a combination of volume, velocity, variety and value. Kevin Geraghty, head of Analytics at 360i, defined the goal of big data analytics well when he said: “We are trying to listen to what the customer is telling us through their behavior.” The goal of big data analytics is to make the best business decisions possible.
The rapid increase of investment in biotherapeutics is changing the profile of the biopharmaceutical industry and, along with it, data management in the laboratory. With attention on longer patent life, high barriers to generic equivalents and personalized medicine, an increasing portion of R&D spending is being allocated to large molecule therapies, such as monoclonal antibodies.
On Monday, January 27, over 250 people gathered at the small evangelical church in Daisbach – a very small, quiet town in Southern Germany – to bid farewell to Hans Meuer, the founder of TOP500 and the ISC General Chair. Hans passed away on January 20 at the age of 77.
Ah, 30,000 feet and some old Dire Straits on the headphones, and waiting for my warm Heineken. Perfect. Though I enjoy lambasting companies that get it wrong, I’m also quick to stomp my feet and clap my hands when companies get it right. And I’ll do that, I promise, but allow me my fun first.
Josiah Stamp said: “The individual source of the statistics may easily be the weakest link.” Nowhere is this more true than in the new field of text mining, given the wide variety of textual information. By some estimates, 80 percent of the information available occurs as free-form text which, prior to the development of text mining, needed to be read in its entirety in order for information to be obtained from it.
A computer program is running 24 hours a day at Carnegie Mellon University, searching the Web for images, doing its best to understand them on its own and, as it builds a growing visual database, gathering common sense on a massive scale. The Never Ending Image Learner, or NEIL, leverages recent advances in computer vision that enable computer programs to identify and label objects in images.
When scientists from around the world visit Dula Parkinson’s microtomography beamline at Berkeley Lab’s Advanced Light Source, they all want the same thing: amazing, scientifically illuminating, micron-scale X-ray views of matter, whether a fiber-reinforced ceramic composite, an energy-rich shale, or a dinosaur bone fragment. Unfortunately, many of them have left lately with something else: debilitating data overload.