In an age of “big data,” a single computer cannot always find the solution a user wants. Computational tasks must instead be distributed across a cluster of computers that analyze a massive data set together. It's how Facebook and Google mine your Web history to present you with targeted ads, and how Amazon and Netflix recommend your next favorite book or movie. But big data is about more than just marketing.
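A minimal sketch of that split-apply-combine pattern, using Python's multiprocessing pool on a single machine as a stand-in for cluster nodes (no specific framework is named above; the word-count task and function names are purely illustrative):

```python
from collections import Counter
from multiprocessing import Pool

def count_words(document):
    """Map step: each worker counts words in its own slice of the data."""
    return Counter(document.split())

if __name__ == "__main__":
    documents = ["big data needs big clusters",
                 "data moves to the workers"] * 1000
    with Pool(processes=4) as pool:      # stand-in for four cluster nodes
        partials = pool.map(count_words, documents)
    # Reduce step: merge the partial counts into a single result.
    totals = sum(partials, Counter())
    print(totals.most_common(3))
```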
Music fans and critics know that the music of the Beatles underwent a dramatic transformation in...
Ensemble forecasting is a key part of weather forecasting. Computers typically run multiple...
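A minimal sketch of the ensemble idea: rerun the same model from slightly perturbed initial conditions and report the mean and spread of the outcomes (the toy "model" and every parameter below are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

def toy_forecast(temp0, days=5, drift=0.3):
    """A deliberately crude model: tomorrow = today + drift + noise."""
    temp = temp0
    for _ in range(days):
        temp += drift + rng.normal(scale=0.5)
    return temp

# Ensemble: 50 runs from slightly perturbed initial temperatures.
members = [toy_forecast(20.0 + rng.normal(scale=0.2)) for _ in range(50)]
print(f"day-5 forecast: {np.mean(members):.1f} C +/- {np.std(members):.1f} C")
```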
IBM is making high performance computing more accessible through the cloud for clients grappling...
The second ISC Big Data conference, themed “From Data to Knowledge,” builds on the success of the inaugural 2013 event. A comprehensive program has been put together by the Steering Committee under the leadership of Sverre Jarp, who officially retired as CTO of CERN openlab in March of this year.
The Cray XC30 system will be used by a nation-wide consortium of scientists called the Indian Lattice Gauge Theory Initiative (ILGTI). The group will research the properties of a phase of matter called the quark-gluon plasma, which existed when the universe was approximately a microsecond old. ILGTI also carries out research on exotic and heavy-flavor hadrons, which will be produced in hadron collider experiments.
Registration is now open for the 2014 ISC Cloud and ISC Big Data Conferences, which will be held this fall in Heidelberg, Germany. The fifth ISC Cloud Conference will take place in the Marriott Hotel from September 29 to 30, and the second ISC Big Data will be held from October 1 to 2 at the same venue.
How using CPU/GPU parallel computing is the next logical step: My work in computational mathematics is focused on developing new, paradigm-shifting ideas in numerical methods for solving mathematical models in various fields. This includes the Schrödinger equation in quantum mechanics, the elasticity model in mechanical engineering, the Navier-Stokes equations in fluid mechanics, Maxwell’s equations in electromagnetism...
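One reason such solvers map well onto CPU/GPU parallelism is that explicit finite-difference stencils update every grid point independently. A NumPy sketch for the 1-D heat equation (not drawn from the author's own methods; grid size and coefficients are illustrative):

```python
import numpy as np

def heat_step(u, alpha, dx, dt):
    """One explicit finite-difference step for u_t = alpha * u_xx.
    Each interior point depends only on its old neighbors, so all
    updates can run in parallel, which is exactly what a GPU exploits."""
    u_new = u.copy()
    u_new[1:-1] = u[1:-1] + alpha * dt / dx**2 * (u[2:] - 2*u[1:-1] + u[:-2])
    return u_new

x = np.linspace(0.0, 1.0, 101)
u = np.exp(-100 * (x - 0.5) ** 2)   # initial heat spike in the middle
for _ in range(500):
    u = heat_step(u, alpha=0.01, dx=x[1] - x[0], dt=1e-3)
print(f"peak temperature after 500 steps: {u.max():.3f}")
```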
IBM Announces $3B Research Initiative to Tackle Chip Grand Challenges for Cloud and Big Data Systems
IBM has announced it is investing $3 billion over the next five years in two broad research and early stage development programs to push the limits of chip technology needed to meet the emerging demands of cloud computing and Big Data systems. These investments are intended to push IBM's semiconductor innovations from today’s breakthroughs into the advanced technology leadership required for the future.
Moab HPC Suite-Enterprise Edition 8.0 (Moab 8.0) is designed to enhance Big Workflow, accelerating insights by processing intensive simulations and big data analysis together. It delivers dynamic scheduling, provisioning, and management of multi-step, multi-application services across HPC, cloud, and big data environments. The software suite bolsters Big Workflow’s three core services: unifying data center resources, optimizing the analysis process, and guaranteeing services to the business.
Fully automated “deep learning” by computers greatly improves the odds of discovering particles such as the Higgs boson, beating even veteran physicists’ abilities.
To make use of huge amounts of data, we first need to categorize them in an effective, fast, and automatic manner before we can understand them. Two researchers have devised a method of cluster analysis (the grouping of data points according to their "similarity") based on simple and powerful principles, which proved to be very efficient and capable of solving some of the most typical problems encountered in this type of analysis.
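The summary does not spell the method out, but one well-known approach in this spirit is density-peak clustering: cluster centers are points that combine high local density with a large distance to any denser point. A rough NumPy sketch (the cutoff d_c and number of centers are illustrative parameters, and this is a sketch of the general idea rather than the researchers' exact algorithm):

```python
import numpy as np

def density_peak_clusters(X, d_c, n_centers=2):
    """Toy density-peak clustering on an (n, dims) array of points."""
    n = len(X)
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    rho = (d < d_c).sum(axis=1) - 1      # local density per point
    delta = np.empty(n)                  # distance to nearest denser point
    nearest_denser = np.full(n, -1)
    for i in range(n):
        denser = np.where(rho > rho[i])[0]
        if len(denser) == 0:             # the globally densest point
            delta[i] = d[i].max()
        else:
            j = denser[np.argmin(d[i, denser])]
            delta[i], nearest_denser[i] = d[i, j], j
    # Centers maximize rho * delta; everyone else joins the cluster of
    # their nearest denser neighbor, assigned in descending density order.
    centers = np.argsort(rho * delta)[-n_centers:]
    labels = np.full(n, -1)
    labels[centers] = np.arange(n_centers)
    for i in np.argsort(-rho):
        if labels[i] == -1:
            j = nearest_denser[i]
            if j == -1:                  # tied for max density
                j = centers[np.argmin(d[i, centers])]
            labels[i] = labels[j]
    return labels

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(2, 0.3, (50, 2))])
print(density_peak_clusters(X, d_c=0.5))
```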
Machine learning, in which computers learn new skills by looking for patterns in training data, is the basis of most recent advances in artificial intelligence, from voice-recognition systems to self-parking cars. It’s also the technique that autonomous robots typically use to build models of their environments. That type of model-building gets complicated, however, in cases in which clusters of robots work as teams.
Jets resulting from particle collisions, like those taking place at the Large Hadron Collider (LHC) housed at CERN near Geneva, Switzerland, are quite possibly the single most important experimental signatures in high-energy physics. Virtually every final-state, high-energy particle produced will be part of a jet.
The recent PRACE Days conference in Barcelona provided powerful reminders that massive data doesn't always become big data — mainly because moving and storing massive data can cost massive money. PRACE is the Partnership for Advanced Computing in Europe, and the 2014 conference was the first to bring together scientific and industrial users of PRACE supercomputers located in major European nations.
IDC has announced the availability of the first in-depth forecasts for high performance data analysis (HPDA), the fast-growing worldwide market for big data workloads that use high performance computing resources. IDC forecasts that the server market for HPDA will grow rapidly, at a 23.5 percent compound annual growth rate (CAGR), to reach $2.7 billion in 2018, and that the related storage market will expand to about $1.6 billion in the same year.
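As a sanity check on those figures, a CAGR of r compounds as value_n = value_0 * (1 + r)^n. Assuming 2013 as the base year (the base year is not stated above), the $2.7 billion 2018 figure implies a starting server market of roughly $0.94 billion:

```python
# Implied base-year market size, assuming five years of 23.5% growth.
r, target, years = 0.235, 2.7e9, 5
base = target / (1 + r) ** years
print(f"implied 2013 HPDA server market: ${base / 1e9:.2f}B")
```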
GE Intelligent Platforms User Summit will address how GE is making the Industrial Internet real. Speakers will include Jeff Immelt, Chairman and CEO, GE; Christine Furstoss, Global Technology Leader, Manufacturing Technologies for GE’s Global Research Center; Ron Reis, Senior Service Manager at GE Oil & Gas; GE Intelligent Platforms General Managers Bernie Anger and Jim Walsh; and customers from all over the world in Oil & Gas, Water, and Manufacturing, among other industries.
Researchers at UCLA have created a nanoscale magnetic component for computer memory chips that could significantly improve their energy efficiency and scalability. The design brings a new and highly sought-after type of magnetic memory one step closer to being used in computers, mobile electronics such as smart phones and tablets, as well as large computing systems for big data.
HP has announced new innovations and sustainable enterprise infrastructure solutions designed to deliver the simplicity, efficiency and investment protection organizations need to bridge the datacenter technologies of today and tomorrow. Big data, mobility, security and cloud computing are forcing organizations to rethink their approach to technology, causing them to invest heavily in IT infrastructure.
Computer systems today can be found in nearly all areas of life, from smartphones to smart cars to self-organized production facilities. These systems supply rapidly growing data volumes, and computer science now faces the challenge of processing these huge amounts of data (big data) in a reasonable and secure manner.
Atos, an international information technology services company, and Bull, a partner for enterprise data, together announced the intended public offer in cash by Atos for all the issued and outstanding shares in the capital of Bull. The Atos offer is set at 4.90 euros per Bull share in cash, representing a 22 percent premium over Bull's closing price.
HPC systems have evolved significantly over the last two decades. Once the domain of purpose-built supercomputers, the field is now ruled by clustered systems, as horizontal scaling has proven to be the most cost-efficient way to increase capacity. What today's supercomputers all have in common is their reliance on distributed computing.
My last blog post described federal government initiatives that have driven data management requirements over the past 10 years or so. Data management is a hot job area — if you tilt the digital stewardship universe a certain direction, almost everything we do falls under the rubric of “data management.” It will feature prominently in the 2015 National Agenda, to be released in conjunction with the Digital Preservation 2014 meeting.
On February 26, 2003, the National Institutes of Health released the “Final NIH Statement on Sharing Research Data.” As you’ll be reminded when you visit that link, 2003 was eons ago in “Internet time.” Yet the vision NIH had for the expanded sharing of research data couldn’t have been more prescient. As the Open Government Data site notes, government data is a tremendous resource that can have a positive impact on ...
Web services companies, such as Facebook, Google and Microsoft, all make promises about how they will use personal information they gather. But ensuring that millions of lines of code in their systems operate in ways consistent with privacy promises is labor-intensive and difficult. A team from Carnegie Mellon University and Microsoft Research, however, has shown these compliance checks can be automated.
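A toy illustration of the idea, not the Carnegie Mellon/Microsoft system itself: annotate code paths with the data type they touch and the purpose they serve, then check every use against declared policy rules (all names and rules below are hypothetical):

```python
# Policy: (data type, purpose) -> allowed?  "Never use location for ads."
POLICY = {
    ("location", "advertising"): False,
    ("location", "fraud-detection"): True,
    ("email", "advertising"): True,
}

# Annotations a static-analysis pass might extract from the codebase.
code_annotations = [
    {"function": "rank_ads", "data": "location", "purpose": "advertising"},
    {"function": "flag_login", "data": "location", "purpose": "fraud-detection"},
]

for use in code_annotations:
    if not POLICY.get((use["data"], use["purpose"]), False):
        print(f"VIOLATION: {use['function']} uses {use['data']} "
              f"for {use['purpose']}")
```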
IBM researchers announced they have demonstrated a new record of 85.9 billion bits of data per square inch in areal data density on low-cost linear magnetic particulate tape — a significant update to one of the computer industry's most resilient data storage technologies for Big Data.
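A back-of-the-envelope conversion gives a feel for the scale. At 85.9 billion bits per square inch, a half-inch-wide, 1,000-meter tape (both dimensions are assumptions for illustration, not figures from the announcement) would hold on the order of 200 TB raw:

```python
# Raw capacity implied by the demonstrated areal density.
areal_density = 85.9e9                  # bits per square inch
tape_area = (1000 * 39.37) * 0.5        # 1,000 m in inches, times 0.5 in width
capacity_tb = areal_density * tape_area / 8 / 1e12
print(f"raw capacity: ~{capacity_tb:.0f} TB per cartridge")
```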
This year’s International Supercomputing Conference (ISC’14) in Leipzig, Germany, is now just one month away. iSGTW speaks to Niko Neufeld ahead of his talk at the event, ‘The Boson in the Haystack,’ which will take place during the session on ‘Emerging Trends for Big Data in HPC’ on Wednesday, June 25.
Technology entrepreneurs wake up every morning with the goal of creating innovations that can change the world. IBM has announced a new class of innovators that are making their visions a reality by creating apps fueled by Watson's cognitive computing intelligence.
Elastic Storage is capable of reducing storage costs by up to 90 percent by automatically moving data onto the most economical storage device. The technology allows enterprises to exploit the exploding growth of data in a variety of forms generated by devices, sensors, business processes and social networks.
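In miniature, such policy-driven tiering might look like the sketch below, which migrates files untouched for 90 days from a fast tier to a cheap one (the paths and threshold are illustrative assumptions, not IBM's implementation):

```python
import os, shutil, time

FAST_TIER, CHEAP_TIER = "/mnt/ssd/data", "/mnt/archive/data"
AGE_LIMIT = 90 * 24 * 3600              # 90 days, in seconds

def migrate_cold_files():
    """Move files not accessed within AGE_LIMIT to the cheaper tier."""
    os.makedirs(CHEAP_TIER, exist_ok=True)
    now = time.time()
    for name in os.listdir(FAST_TIER):
        path = os.path.join(FAST_TIER, name)
        if os.path.isfile(path) and now - os.path.getatime(path) > AGE_LIMIT:
            shutil.move(path, os.path.join(CHEAP_TIER, name))
```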