Big Data

The Lead

Techniques known as dimensionality reduction can help find patterns in the recorded activity of thousands of neurons. Rather than look at all responses at once, these methods find a smaller set of dimensions — in this case three — that capture as much structure in the data as possible.
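The idea behind dimensionality reduction can be sketched with principal component analysis (PCA), a standard technique. This toy NumPy example (a generic illustration on simulated data, not the article's actual methods) projects a simulated neural population onto three dimensions:

```python
import numpy as np

# Toy "neural activity": 1000 time points x 50 neurons, driven by 3 latent signals.
rng = np.random.default_rng(0)
latents = rng.normal(size=(1000, 3))          # hidden low-dimensional structure
mixing = rng.normal(size=(3, 50))             # each neuron mixes the latents
activity = latents @ mixing + 0.1 * rng.normal(size=(1000, 50))

# PCA via SVD: project the centered data onto the top 3 principal components.
centered = activity - activity.mean(axis=0)
_, s, vt = np.linalg.svd(centered, full_matrices=False)
projected = centered @ vt[:3].T               # 1000 x 3 summary of the population

# Because the data really has 3-D structure, 3 components capture most variance.
explained = (s[:3] ** 2).sum() / (s ** 2).sum()
print(projected.shape, round(explained, 3))
```

The fraction of variance explained tells you how much structure the reduced dimensions retain.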

New Platform Enables Large-Scale Neuroscience

July 28, 2014 2:23 pm | by Howard Hughes Medical Institute | News | Comments

In an age of “big data,” a single computer cannot always find the solution a user wants. Computational tasks must instead be distributed across a cluster of computers that analyze a massive data set together. It's how Facebook and Google mine your Web history to present you with targeted ads, and how Amazon and Netflix recommend your next favorite book or movie. But big data is about more than just marketing.
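The divide-and-combine pattern described above can be sketched as a toy map-reduce word count (a generic illustration of the cluster-computing idea, not any company's actual pipeline):

```python
from collections import Counter
from functools import reduce

# Each "machine" in the cluster holds a shard of the full data set.
shards = [
    ["big", "data", "big"],
    ["data", "cluster", "data"],
    ["big", "cluster"],
]

# Map: every machine counts its own shard independently (in parallel in practice).
partial_counts = [Counter(shard) for shard in shards]

# Reduce: merge the partial results into one global answer.
total = reduce(lambda a, b: a + b, partial_counts)
print(total)  # Counter({'big': 3, 'data': 3, 'cluster': 2})
```

No single machine ever sees the whole data set; only small partial results are combined.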

AI Reveals The Beatles’ Dramatic Musical Transformation

July 28, 2014 12:29 pm | by Lawrence Technological University | News | Comments

Music fans and critics know that the music of the Beatles underwent a dramatic transformation in...

K Computer Runs Largest Ever Ensemble Simulation of Global Weather

July 25, 2014 2:25 pm | by RIKEN | News | Comments

Ensemble forecasting is a key part of weather forecasting. Computers typically run multiple...

IBM Expands High Performance Computing Capabilities in the Cloud

July 24, 2014 2:18 pm | by IBM | News | Comments

IBM is making high performance computing more accessible through the cloud for clients grappling...

Internet of Things and Hadoop to be featured at ISC Big Data

July 21, 2014 2:07 pm | by ISC | News | Comments

The second ISC Big Data conference, themed “From Data To Knowledge,” builds on the success of the inaugural 2013 event. A comprehensive program has been put together by the Steering Committee under the leadership of Sverre Jarp, who officially retired as CTO of CERN openlab in March of this year.

Cray Awarded Contract to Install India's First Cray XC30 Supercomputer

July 16, 2014 3:33 am | by Cray | News | Comments

The Cray XC30 system will be used by a nation-wide consortium of scientists called the Indian Lattice Gauge Theory Initiative (ILGTI). The group will research the properties of a phase of matter called the quark-gluon plasma, which existed when the universe was approximately a microsecond old. ILGTI also carries out research on exotic and heavy-flavor hadrons, which will be produced in hadron collider experiments.

Registration Opens for ISC Cloud and ISC Big Data Conferences

July 15, 2014 11:28 am | by ISC | News | Comments

Registration is now open for the 2014 ISC Cloud and ISC Big Data Conferences, which will be held this fall in Heidelberg, Germany. The fifth ISC Cloud Conference will take place in the Marriott Hotel from September 29 to 30, and the second ISC Big Data will be held from October 1 to 2 at the same venue.


On the Trail of Paradigm-Shifting Methods for Solving Mathematical Models

July 15, 2014 10:11 am | by Hengguang Li | Blogs | Comments

How using CPU/GPU parallel computing is the next logical step: My work in computational mathematics is focused on developing new, paradigm-shifting ideas in numerical methods for solving mathematical models in various fields. This includes the Schrödinger equation in quantum mechanics, the elasticity model in mechanical engineering, the Navier-Stokes equation in fluid mechanics, Maxwell’s equations in electromagnetism...
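A minimal example of the kind of numerical method involved (a generic finite-difference sketch, not Prof. Li's paradigm-shifting techniques) solves a one-dimensional model problem on a grid:

```python
import numpy as np

# Solve -u''(x) = pi^2 * sin(pi x) on (0, 1) with u(0) = u(1) = 0.
# The exact solution is u(x) = sin(pi x), so we can check the error.
n = 99                                  # number of interior grid points
h = 1.0 / (n + 1)
x = np.linspace(h, 1 - h, n)
f = np.pi ** 2 * np.sin(np.pi * x)

# Tridiagonal second-difference matrix: (-u[i-1] + 2u[i] - u[i+1]) / h^2.
A = (np.diag(2 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1)) / h ** 2
u = np.linalg.solve(A, f)

err = np.max(np.abs(u - np.sin(np.pi * x)))
print(err < 1e-3)  # second-order scheme: error shrinks like h^2
```

Real models such as Navier-Stokes lead to far larger systems of this shape, which is where CPU/GPU parallelism pays off.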

IBM Announces $3B Research Initiative to Tackle Chip Grand Challenges for Cloud and Big Data Systems

July 9, 2014 4:58 pm | by IBM | News | Comments

IBM has announced it is investing $3 billion over the next five years in two broad research and early stage development programs to push the limits of chip technology needed to meet the emerging demands of cloud computing and Big Data systems. These investments are intended to push IBM's semiconductor innovations from today’s breakthroughs into the advanced technology leadership required for the future.

Moab HPC Suite-Enterprise Edition 8.0

July 7, 2014 10:04 am | Adaptive Computing | Product Releases | Comments

Moab HPC Suite-Enterprise Edition 8.0 (Moab 8.0) is designed to enhance Big Workflow by processing intensive simulations and big data analysis to accelerate insights. It delivers dynamic scheduling, provisioning and management of multi-step/multi-application services across HPC, cloud and big data environments. The software suite bolsters Big Workflow’s core services: unifying data center resources, optimizing the analysis process and guaranteeing services to the business.

'Deep Learning' Makes Search for Exotic Particles Easier

July 2, 2014 4:10 pm | by UC Irvine | News | Comments

Fully automated “deep learning” by computers greatly improves the odds of discovering particles such as the Higgs boson, beating even veteran physicists’ abilities.                         

A Simple Solution for Big Data

June 27, 2014 11:19 am | by SISSA | News | Comments

To be able to use huge amounts of data, we have to understand them, and before that we need to categorize them in an effective, fast and automatic manner. Two researchers have devised a type of cluster analysis, the grouping of data according to "similarity," based on simple and powerful principles that proved very efficient and capable of solving some of the most typical problems encountered in this type of analysis.
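Cluster analysis groups points by similarity. A minimal k-means sketch in NumPy (a standard textbook illustration, not the specific method the researchers devised) shows the basic idea:

```python
import numpy as np

def kmeans(points, k, iters=20, seed=0):
    """Group points into k clusters by repeatedly assigning each point
    to its nearest centroid and recomputing the centroids."""
    rng = np.random.default_rng(seed)
    centroids = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        # Assign each point to the closest centroid (Euclidean distance).
        dists = np.linalg.norm(points[:, None] - centroids[None, :], axis=2)
        labels = dists.argmin(axis=1)
        # Move each centroid to the mean of its assigned points.
        centroids = np.array([points[labels == j].mean(axis=0) for j in range(k)])
    return labels

# Two well-separated blobs of points should land in different clusters.
rng = np.random.default_rng(1)
data = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(5, 0.3, (50, 2))])
labels = kmeans(data, k=2)
```

Methods like the one in the article aim to handle cases where such simple schemes fail, such as clusters of irregular shape.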

Algorithm lets independent agents collectively produce a machine-learning model without aggregating data.

Robots Collaborate Independently

June 26, 2014 11:05 am | by Larry Hardesty, MIT | News | Comments

Machine learning, in which computers learn new skills by looking for patterns in training data, is the basis of most recent advances in artificial intelligence, from voice-recognition systems to self-parking cars. It’s also the technique that autonomous robots typically use to build models of their environments. That type of model-building gets complicated, however, in cases in which clusters of robots work as teams.
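One simple way independent agents can learn collectively without pooling raw data is to exchange only model parameters. This toy sketch averages locally fitted linear-regression coefficients (a generic illustration of the idea, not the MIT algorithm):

```python
import numpy as np

def fit_line(x, y):
    """Least-squares slope and intercept from one robot's local observations."""
    A = np.vstack([x, np.ones_like(x)]).T
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef  # [slope, intercept]

# Three robots observe the same underlying line y = 2x + 1 with local noise.
rng = np.random.default_rng(0)
local_models = []
for _ in range(3):
    x = rng.uniform(0, 10, 100)
    y = 2 * x + 1 + 0.1 * rng.normal(size=100)
    local_models.append(fit_line(x, y))   # only coefficients leave the robot

# Combine by averaging parameters -- the raw observations are never aggregated.
combined = np.mean(local_models, axis=0)
print(combined.round(2))
```

Each robot's data stays local; only the two fitted numbers are shared and merged.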

High-energy Physics: Predicting the Emergence of Jets

June 19, 2014 4:03 pm | by Amber Harmon, iSGTW | News | Comments

Jets resulting from particle collisions, like those taking place at the Large Hadron Collider (LHC) housed at CERN near Geneva, Switzerland, are quite possibly the single most important experimental signatures in high-energy physics. Virtually every final-state, high-energy particle produced will be part of a jet.

Steve Conway is Research VP, IDC High Performance Computing Group.

When Massive Data Never becomes Big Data

June 18, 2014 3:38 pm | by Steve Conway, IDC | Blogs | Comments

The recent PRACE Days conference in Barcelona provided powerful reminders that massive data doesn't always become big data — mainly because moving and storing massive data can cost massive money. PRACE is the Partnership for Advanced Computing in Europe, and the 2014 conference was the first to bring together scientific and industrial users of PRACE supercomputers located in major European nations.

IDC’s new in-depth forecasts are the first that track more than a dozen application and industry segments, including economically important new use cases for HPC.

IDC Announces First In-Depth Forecasts for Worldwide HPC Big Data Market

June 18, 2014 8:57 am | by IDC | News | Comments

IDC has announced the availability of the first in-depth forecasts for high performance data analysis (HPDA), the fast-growing worldwide market for big data workloads that use high performance computing resources. IDC forecasts that the server market for HPDA will grow rapidly at a 23.5 percent compound annual growth rate (CAGR) to reach $2.7 billion in 2018, and that the related storage market will expand to about $1.6 billion in the same year.
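A compound annual growth rate compounds multiplicatively each year, so a quick check shows what 23.5 percent sustained over five years implies (the five-year span is the forecast horizon; the base-year revenue itself is not quoted above):

```python
# Compound annual growth: value_n = value_0 * (1 + rate) ** years.
def cagr_multiple(rate, years):
    """Factor by which a quantity grows at the given annual rate."""
    return (1 + rate) ** years

# At 23.5% CAGR, a market nearly triples in five years.
m = cagr_multiple(0.235, 5)
print(round(m, 2))  # ~2.87x over five years
```

Working backward, a market reaching $2.7 billion after five years at that rate would have started from roughly $2.7B / 2.87, just under $1 billion.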

GE Intelligent Platforms User Summit

June 16, 2014 3:26 pm | Events

GE Intelligent Platforms User Summit will address how GE is making the Industrial Internet real. Speakers will include Jeff Immelt, Chairman and CEO, GE; Christine Furstoss, Global Technology Leader, Manufacturing Technologies for GE’s Global Research Center; Ron Reis, Senior Service Manager at GE Oil & Gas; GE Intelligent Platforms General Managers Bernie Anger and Jim Walsh; and customers from around the world in Oil & Gas, Water and Manufacturing, among others.

A new structure developed by UCLA researchers for more energy-efficient computer chips. The arrows indicate the effective magnetic field due to the structure's asymmetry. Courtesy of UCLA Engineering

Innovative Nanoscale Structure Could Yield Higher-performance Computer Memory

June 12, 2014 3:22 pm | by Matthew Chin, UCLA | News | Comments

Researchers at UCLA have created a nanoscale magnetic component for computer memory chips that could significantly improve their energy efficiency and scalability. The design brings a new and highly sought-after type of magnetic memory one step closer to being used in computers, mobile electronics such as smart phones and tablets, as well as large computing systems for big data.

The all-flash HP 3PAR StoreServ 7450 Storage array

Making Enterprise Architecture of the Future a Reality

June 10, 2014 4:29 pm | by HP | News | Comments

HP has announced new innovations and sustainable enterprise infrastructure solutions designed to deliver the simplicity, efficiency and investment protection organizations need to bridge the datacenter technologies of today and tomorrow. Big data, mobility, security and cloud computing are forcing organizations to rethink their approach to technology, causing them to invest heavily in IT infrastructure.

By means of an algorithm, increasing networking of students on Facebook can be displayed according to their age. Courtesy of Michael Hamann, KIT

Algorithms for Big Data: Optimizing Daily, Routine Processing

June 10, 2014 4:32 am | by Karlsruhe Institute of Technology | News | Comments

Computer systems today can be found in nearly all areas of life, from smartphones to smart cars to self-organized production facilities. These systems supply rapidly growing data volumes, and computer science now faces the challenge of processing these huge amounts of data (big data) in a reasonable and secure manner.

Atos to Acquire Bull, Create Global Cloud, Cybersecurity, Big Data Company

June 2, 2014 3:33 pm | by Bull | News | Comments

Atos, an international information technology services company, and Bull, a partner for enterprise data, together announced the intended public offer in cash by Atos for all the issued and outstanding shares in the capital of Bull. Atos' offer is set at 4.90 euros per Bull share in cash, representing a 22 percent premium over Bull's closing price.

Gord Sissons, Product Marketing Manager for IBM Platform Symphony at IBM

The Evolving HPC Cluster: Big Compute meets Big Data

May 28, 2014 4:16 pm | by Gord Sissons, IBM | Blogs | Comments

HPC systems have evolved significantly over the last two decades. Once the domain of purpose-built supercomputers, HPC today is ruled by clustered systems. Horizontal scaling has proven to be the most cost-efficient way to increase capacity. What supercomputers all have in common today is their reliance on distributed computing.

Innovative tools and services have appeared to meet the data management needs created by federal requirements.

All that Big Data Is Not Going to Manage Itself: Part Two

May 28, 2014 2:18 pm | by Butch Lazorchak, Library of Congress | Blogs | Comments

My last blog post described federal government initiatives that have driven data management requirements over the past 10 years or so. Data management is a hot job area — if you tilt the digital stewardship universe a certain direction, almost everything we do falls under the rubric of “data management.” It will feature prominently in the 2015 National Agenda, to be released in conjunction with the Digital Preservation 2014 meeting.

All that Big Data Is Not Going to Manage Itself: Part One

May 27, 2014 10:56 am | by Butch Lazorchak, Library of Congress | Blogs | Comments

On February 26, 2003, the National Institutes of Health released the “Final NIH Statement on Sharing Research Data.” As you’ll be reminded when you visit that link, 2003 was eons ago in “Internet time.” Yet the vision NIH had for the expanded sharing of research data couldn’t have been more prescient. As the Open Government Data site notes, government data is a tremendous resource that can have a positive impact on ...

Researchers developed a prototype automated system that is now running on the data analytics pipeline of Bing. It's the first time automated privacy compliance analysis has been applied to the production code of an Internet-scale system.

Carnegie Mellon, Microsoft Research Automate Privacy Compliance for Big Data Systems

May 22, 2014 3:19 pm | by Carnegie Mellon University | News | Comments

Web services companies, such as Facebook, Google and Microsoft, all make promises about how they will use personal information they gather. But ensuring that millions of lines of code in their systems operate in ways consistent with privacy promises is labor-intensive and difficult. A team from Carnegie Mellon University and Microsoft Research, however, has shown these compliance checks can be automated.

For the third time in less than 10 years, IBM scientists, in collaboration with FUJIFILM, have achieved a new record in areal data density.

New Record Achieved for Storing Massive Amounts of Big Data

May 22, 2014 3:11 pm | by IBM | News | Comments

IBM researchers announced they have demonstrated a new record of 85.9 billion bits of data per square inch in areal data density on low-cost linear magnetic particulate tape — a significant update to one of the computer industry's most resilient data storage technologies for Big Data.
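To put the figure in perspective, the record areal density converts to roughly 10.7 decimal gigabytes of data per square inch of tape:

```python
# Convert the record areal density from bits to (decimal) gigabytes.
bits_per_sq_inch = 85.9e9
gb_per_sq_inch = bits_per_sq_inch / 8 / 1e9   # 8 bits per byte, 1e9 bytes per GB
print(round(gb_per_sq_inch, 2))  # 10.74
```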

Event display of a proton–lead event in the LHCb detector, showing the reconstructed tracks of particles produced in the collision. The proton beam travels from left to right. Image courtesy CERN.

Handling Big Data to Understand Antimatter at CERN’s LHCb Experiment

May 21, 2014 2:11 pm | by Andrew Purcell | Articles | Comments

This year’s International Supercomputing Conference (ISC’14) in Leipzig, Germany, is now just one month away. iSGTW speaks to Niko Neufeld ahead of his talk at the event, ‘The Boson in the Haystack,’ which will take place during the session on ‘Emerging Trends for Big Data in HPC’ on Wednesday, June 25.

The IBM Watson Group has a new headquarters at 51 Astor Place in New York City’s “Silicon Alley” technology hub, leveraging the talents of approximately 2,000 professionals, whose goal is to design, develop and accelerate the adoption of Watson cognitive computing.

IBM Reveals Companies Developing Watson-Powered Apps

May 19, 2014 4:42 pm | by IBM | News | Comments

Technology entrepreneurs wake up every morning with the goal of creating innovations that can change the world. IBM has announced a new class of innovators that are making their visions a reality by creating apps fueled by Watson's cognitive computing intelligence.

IBM Elastic Storage

May 16, 2014 2:34 pm | IBM Corporation | Product Releases | Comments

Elastic Storage is capable of reducing storage costs up to 90 percent by automatically moving data onto the most economical storage device. The technology allows enterprises to exploit the exploding growth of data in a variety of forms generated by devices, sensors, business processes and social networks.
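Policy-driven tiering of this kind can be sketched as a rule that moves cold data onto cheaper media. The following is a hypothetical illustration only; the threshold, tier names and policy are assumptions, not Elastic Storage's actual mechanism:

```python
from datetime import datetime, timedelta

# Hypothetical tiering rule: files untouched for 30+ days move to cheap storage.
THRESHOLD = timedelta(days=30)

def assign_tier(last_access, now):
    """Return the storage tier for a file based on how recently it was read."""
    return "flash" if now - last_access < THRESHOLD else "tape"

now = datetime(2014, 5, 16)
files = {
    "hot.log": datetime(2014, 5, 15),   # accessed yesterday -> stays on flash
    "cold.csv": datetime(2014, 1, 1),   # untouched for months -> moves to tape
}
placement = {name: assign_tier(ts, now) for name, ts in files.items()}
print(placement)  # {'hot.log': 'flash', 'cold.csv': 'tape'}
```

The cost savings come from keeping only the actively used fraction of data on the expensive, fast tier.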
