How can organizations embrace — instead of brace for — the rapidly intensifying collision of public and private clouds, HPC environments and Big Data? The current go-to solution for many organizations is to run these technology assets in siloed, specialized environments. This approach falls short, however, typically taxing one datacenter area while others remain underutilized, functioning as little more than expensive storage space.
At Cycle Computing we're seeing several large trends as they relate to Big Data and Analytics. We...
Scalable Productivity and the Ever-Increasing Tie between Data Analytics, Data Management and Computation
Cray continues to see an increasing trend in the HPC marketplace that we are calling “data-...
Steve Conway, IDC VP of HPC, explains that, to date, most data-intensive HPC jobs in the government, academic and industrial sectors have involved the modeling and simulation of complex physical and quasi-physical systems. However, he notes that from the start of the supercomputer era in the 1960s — and even earlier — an important subset of HPC jobs has involved analytics: attempts to uncover useful information and patterns in the data itself. Cryptography, one of the original scientific-technical computing applications, falls predominantly into this category.
Big Data tools such as Grok and IBM Watson are enabling large organizations to behave more like agile startups. Of the transformative technology developments that have ushered in the current frenzy of activity along the information superhighway, the 1994 invention of the “Wiki” by Ward Cunningham is among the most disruptive.
The 10-day tour of Europe was not your typical itinerary — Garching, Karlsruhe, Villigen, Hamburg and Oxford. In January. But David Brown and Craig Tull of the Computational Research Division and Alex Hexemer of the Advanced Light Source weren't touring to see the sights — they were more interested in seeing the lights: powerful scientific instruments known as light sources that use intense X-rays to study materials.
Size alone does not define big data — it is best defined as a combination of volume, velocity, variety and value. Kevin Geraghty, head of analytics at 360i, defined the goal of big data analytics well when he said: “We are trying to listen to what the customer is telling us through their behavior.” The goal, ultimately, is to make the best business decisions possible.
Just as Netflix uses an algorithm to recommend movies we ought to see, a Stanford software system offers by-the-moment advice to thousands of server-farm computers on how to efficiently share the workload. We hear a lot about the future of computing in the cloud, but not much about the efficiency of the data centers that make the cloud possible, where clusters work together to host applications ranging from big data analytics...
The Russian Ministry of Education and Science has awarded a $3.4 million “mega-grant” to Alexei Klimentov, Physics Applications Software Group Leader at the U.S. Department of Energy’s Brookhaven National Laboratory, to develop new “big data” computing tools for the advancement of science.
AT&T and IBM have announced a new global alliance agreement to develop solutions that help support the "Internet of Things." The companies will combine their analytic platforms, cloud and security technologies with privacy in mind to gain more insights on data collected from machines in a variety of industries.
Although the time and cost of sequencing an entire human genome have plummeted, analyzing the resulting three billion base pairs of genetic information from a single genome can take many months. However, a team working with Beagle, one of the world's fastest supercomputers devoted to life sciences, reports that genome analysis can be radically accelerated. This computer is able to analyze 240 full genomes in about two days.
The Intel Xeon processor E7 v2 family delivers capabilities to process and analyze large, diverse amounts of data to unlock information that was previously inaccessible. The processor family has triple the memory capacity of the previous generation processor family, allowing much faster and more thorough data analysis.
IBM announced that it has achieved a new technological advancement that will help improve Internet speeds to 200-400 Gigabits per second (Gb/s) at extremely low power. The speed boost is based on a device that can transfer Big Data between clouds and data centers four times faster than current technology. At this speed, 160 Gigabytes could be downloaded in only a few seconds.
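As a rough back-of-the-envelope check (an illustration based only on the 400 Gb/s figure quoted above, ignoring protocol and link overhead):

\[ t = \frac{160\ \text{GB} \times 8\ \text{bits/byte}}{400\ \text{Gb/s}} = \frac{1280\ \text{Gb}}{400\ \text{Gb/s}} \approx 3.2\ \text{s} \]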
HPC matters, now more than ever. What better way to show how it matters than through your submission to the SC14 Technical Program? Technical Program submissions opened February 14 for Research Papers, Posters (Regular, Education, and ACM Student Research Competition), Panels, Tutorials, BOF Sessions, Scientific Visualization and Data Analytics Showcase, Emerging Technologies, and Doctoral Showcase.
It is now possible to sense scientific data: by applying subconscious processing to big data analysis, we gain a new means of dealing with the mountains of information we face in our environment. Imagine that data could be transposed into a tactile experience.
A major event in France and Europe, the TERATEC Forum brings together the top international experts in high-performance numerical design and simulation, confirming the strategic importance of these technologies for developing industrial competitiveness and innovation capacity. For its 9th edition, the Forum will be the place to be for over 1,000 professionals, with hot topics, plenary sessions, technical workshops and an exhibition of hardware, software and service providers.
Thomas Edison received 1,093 patents during his lifetime for inventions spanning everything from the tattoo machine to the electric grid. His innovative mind inspired President Ronald Reagan to celebrate National Inventors’ Day on the anniversary of Edison’s birthday on February 11. Edison’s curious spirit never left the company he started. Three GE scientists received the...
HPCS is a multi-disciplinary conference, considered Canada's premier advanced computing forum. Each year, Canadian researchers, analysts, and IT professionals from academia and industry gather to exchange the ideas, tools, and new discoveries that are driving today's innovations in computational research.
IBM has launched a 10-year initiative to bring Watson and other cognitive systems to Africa in a bid to fuel development and spur business opportunities across the world's fastest-growing continent. Dubbed "Project Lucy" after the earliest known human ancestor, the initiative will see IBM invest US$100 million.
Computational scientists now have the opportunity to apply for the upcoming Argonne Training Program on Extreme-Scale Computing (ATPESC), to take place from August 3-15, 2014. The program provides intensive hands-on training on the key skills, approaches and tools to design, implement, and execute computational science and engineering applications on current supercomputers and the HPC systems of the future.
The HPC Advisory Council and the Swiss Supercomputing Centre will host the HPC Advisory Council Switzerland Conference 2014. The conference will focus on High-Performance Computing essentials, new developments and emerging technologies, best practices and hands-on training.
As computers enter ever more areas of our daily lives, the amount of data they produce has grown enormously. But for this “big data” to be useful it must first be analyzed, meaning it needs to be stored in such a way that it can be accessed quickly when required.
Texas A&M System Teams with IBM to Drive Computational Sciences Research through Big Data and Analytics
Texas A&M University System and IBM announced an agreement that is the beginning of a broad research collaboration supported by one of the largest computational sciences infrastructures dedicated to advances in agriculture, geosciences and engineering.
On Monday, January 27, over 250 people gathered at the small evangelical church in Daisbach – a very small, quiet town in Southern Germany – to bid farewell to Hans Meuer, the founder of TOP500 and the ISC General Chair. Hans passed away on January 20 at the age of 77.
The goal of this conference is to bring together all the developers and researchers involved in solving the software challenges of the exascale era. The conference focuses on issues of applications for exascale and the associated tools, software programming models and libraries.
ISC General Chair Prof. Dr. Hans Werner Meuer passed away at the age of 77 at his home in Daisbach, Southern Germany, on January 20, 2014, after a brief battle with cancer. Meuer had been involved in data processing since 1960. He served as specialist, project leader, and group and department chief during his 11 years, from 1962 to 1973, at the Research Center in Jülich, Germany.
The sixth generation of enterprise X-Architecture for System x and PureSystems servers provides improvements in the performance and economics of x86-based systems for analytics and cloud. As users adopt analytics for greater business insight and move critical workloads like ERP, analytics and database to the cloud for increased efficiency and lower costs, x86-based systems are a popular choice.