Big Data

The Lead

MIT’s Computer Science and Artificial Intelligence Laboratory has released a data-visualization tool that lets users highlight aberrations and possible patterns in the graphical display; the tool then automatically determines which data sources are responsible.

Visual Control of Big Data: Recomputing Visualizations without Aberrant Results

August 20, 2014 10:44 am | by Larry Hardesty, MIT | News | Comments

In the age of big data, visualization tools are vital. With a single glance at a graphic display, a human being can recognize patterns that a computer might fail to find even after hours of analysis. But what if there are aberrations in the patterns? Or what if there’s just a suggestion of a visual pattern that’s not distinct enough to justify any strong inferences? Or what if the pattern is clear, but not what was expected?

New Supercomputing Center to Support Big Data and Analytics, Cybersecurity and other STEM Skills

August 18, 2014 3:57 pm | by IBM | News | Comments

Florida Polytechnic University, Flagship Solutions Group and IBM have announced a new...

Blue Waters Project to Offer Graduate Visualization Course in Spring 2015

August 18, 2014 12:12 pm | by NCSA | News | Comments

NCSA’s Blue Waters project will offer a graduate course on High Performance Visualization for...

CEEDS Project: New Ways of Exploring Big Data

August 14, 2014 3:36 pm | by European Commission, CORDIS | News | Comments

In a society that has to understand increasingly big and complex datasets, EU researchers are...


Michael J. Fox speaks at Lotusphere 2012. The potential to collect and analyze data from thousands of individuals on measurable features of Parkinson's, such as slowness of movement, tremor and sleep quality, could enable researchers to assemble a better…

Intel, Michael J. Fox Foundation Start Smartwatch Study

August 14, 2014 3:25 pm | by Intel | News | Comments

The Michael J. Fox Foundation for Parkinson's Research (MJFF) and Intel have announced a collaboration aimed at improving research and treatment for Parkinson's disease — a neurodegenerative brain disease second only to Alzheimer's in worldwide prevalence. The collaboration includes a multiphase research study using a new big data analytics platform that detects patterns in participant data collected from wearable technologies.

Prof. Dr. Stefan Wrobel, Director, Fraunhofer Institute for Intelligent Analysis & Information Systems (IAIS) and Professor of Computer Science, University of Bonn

Prof. Dr. Stefan Wrobel

August 14, 2014 12:29 pm | Biographies

Prof. Dr. Stefan Wrobel, M.S., is director of the Fraunhofer Institute for Intelligent Analysis and Information Systems (IAIS) and Professor of Computer Science at University of Bonn. He studied Computer Science in Bonn and Atlanta, GA, USA (M.S. degree, Georgia Institute of Technology), receiving his doctorate from University of Dortmund.

Dirk Slama, Business Development Director, Bosch SI

Dirk Slama

August 14, 2014 12:15 pm | Biographies

Dirk Slama is Director of Business Development at Bosch Software Innovations. Bosch SI is spearheading the Internet of Things (IoT) activities of Bosch, the global engineering group. As Conference Chair of the Bosch ConnectedWorld, Dirk helps shape the IoT strategy of Bosch. Dirk has over 20 years of experience in very large-scale application projects, systems integration and Business Process Management. His international work experience includes projects for Lufthansa Systems, Boeing, AT&T, NTT DoCoMo, HBOS and others.

A brain-inspired chip to transform mobility and Internet of Things through sensory perception. Courtesy of IBM

Chip with Brain-inspired Non-Von Neumann Architecture has 1M Neurons, 256M Synapses

August 11, 2014 12:13 pm | by IBM | News | Comments

Scientists from IBM have unveiled the first neurosynaptic computer chip to achieve an unprecedented scale of one million programmable neurons, 256 million programmable synapses and 46 billion synaptic operations per second per watt. At 5.4 billion transistors, this fully functional and production-scale chip is currently one of the largest CMOS chips ever built, yet, while running at biological real time, it consumes a minuscule 70 mW.

Optalysys is currently developing two products, a ‘Big Data’ analysis system and an Optical Solver Supercomputer, both of which are expected to be launched in 2017.

Light-speed Computing: Prototype Optical Processor Set to Revolutionize Supercomputing

August 8, 2014 4:13 pm | by Optalysys | News | Comments

Cambridge, UK-based start-up Optalysys has stated that it is only months away from launching a prototype optical processor with the potential to deliver exascale levels of processing power on a standard-sized desktop computer. The company will demonstrate its prototype, which meets NASA Technology Readiness Level 4, in January of next year.

ESnet's Eli Dart moved 56 TB of climate data from 21 sites to NERSC, a task that took three months. In contrast, it took just two days to transfer the raw dataset using Globus from NERSC to the Mira supercomputer at Argonne National Laboratory.

Weathering the Flood of Big Data in Climate Research

August 6, 2014 4:16 pm | by ESnet | News | Comments

Big Data, it seems, is everywhere, usually characterized as a Big Problem. But researchers at Lawrence Berkeley National Laboratory are adept at accessing, sharing, moving and analyzing massive scientific datasets. At a July 14-16, 2014, workshop focused on climate science, Berkeley Lab experts shared their expertise with other scientists working with big datasets.

George Vacek is life sciences global director at DataDirect Networks.

Enabling Innovation and Discovery through Data-Intensive High Performance Cloud and Big Data Infrastructure

July 29, 2014 2:34 pm | by George Vacek, DataDirect Networks | Blogs | Comments

As the size and scale of life sciences datasets increases — think large-cohort longitudinal studies with multiple samples and multiple protocols — so does the challenge of storing, interpreting and analyzing this data. Researchers and data scientists are under increasing pressure to identify the most relevant and critical information within massive and messy data sets, so they can quickly make the next discovery.

Techniques known as dimensionality reduction can help find patterns in the recorded activity of thousands of neurons. Rather than look at all responses at once, these methods find a smaller set of dimensions — in this case three — that capture as much structure in the data as possible.

New Platform Enables Large-Scale Neuroscience

July 28, 2014 2:23 pm | by Howard Hughes Medical Institute | News | Comments

In an age of “big data,” a single computer cannot always find the solution a user wants. Computational tasks must instead be distributed across a cluster of computers that analyze a massive data set together. It's how Facebook and Google mine your Web history to present you with targeted ads, and how Amazon and Netflix recommend your next favorite book or movie. But big data is about more than just marketing.
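The dimensionality-reduction idea described in the caption above can be sketched with PCA, one common such technique. This is an illustrative sketch on synthetic data, not the platform's actual (distributed) code: a few latent signals are mixed into many "neuron" traces, and a singular value decomposition recovers three dimensions that summarize the whole population.

```python
import numpy as np

# Illustrative sketch: reduce recordings from many "neurons" to 3 dimensions
# via PCA. Synthetic data only -- 3 latent signals mixed into 1,000 traces.
rng = np.random.default_rng(0)
n_neurons, n_timepoints = 1000, 200

latent = rng.standard_normal((3, n_timepoints))            # hidden structure
mixing = rng.standard_normal((n_neurons, 3))
recordings = mixing @ latent + 0.1 * rng.standard_normal((n_neurons, n_timepoints))

# PCA by SVD: rows are neurons, columns are timepoints.
centered = recordings - recordings.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
low_dim = Vt[:3]  # 3 time-varying components summarizing all neurons

explained = (s[:3] ** 2).sum() / (s ** 2).sum()
print(f"variance captured by 3 dimensions: {explained:.2%}")
```

Because the synthetic data really does have three latent signals, three components capture nearly all the variance; real neural recordings are messier, which is where distributed analysis becomes necessary.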

The algorithm's automatic placement of the albums agreed with the chronological order in which each Beatles album was recorded.

AI Reveals The Beatles’ Dramatic Musical Transformation

July 28, 2014 12:29 pm | by Lawrence Technological University | News | Comments

Music fans and critics know that the music of the Beatles underwent a dramatic transformation in just a few years. But, until now, there hasn’t been a scientific way to measure the progression. Computer scientists at Lawrence Technological University have developed an artificial intelligence algorithm that can analyze and compare musical styles, enabling research into their musical progression.

K computer installed in the computer room. Each computer rack is equipped with about 100 CPUs. In the Computer Building, 800 or more computer racks are installed for the K computer. Courtesy of RIKEN

K Computer Runs Largest Ever Ensemble Simulation of Global Weather

July 25, 2014 2:25 pm | by RIKEN | News | Comments

Ensemble forecasting is a key part of weather forecasting. Computers typically run multiple simulations using slightly different initial conditions or assumptions, and then analyze them together to try to improve forecasts. Using Japan’s K computer, researchers have succeeded in running 10,240 parallel simulations of global weather, the largest number ever performed, using data assimilation to reduce the range of uncertainties.
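The ensemble technique described above can be sketched in a few lines: run the same model many times from slightly perturbed initial conditions, then summarize the spread of outcomes. A chaotic toy model (the logistic map) stands in for a real atmospheric model here; all parameters are illustrative.

```python
import numpy as np

# Toy ensemble forecast: integrate a chaotic toy model (the logistic map,
# standing in for a real atmospheric model) from many slightly perturbed
# initial states and examine the spread of outcomes.
rng = np.random.default_rng(42)
n_members, n_steps = 64, 30

x = 0.4 + 1e-4 * rng.standard_normal(n_members)  # perturbed initial conditions
for _ in range(n_steps):
    x = 3.9 * x * (1.0 - x)  # chaotic regime of the logistic map

mean_forecast, spread = x.mean(), x.std()
print(f"ensemble mean: {mean_forecast:.3f}, spread: {spread:.3f}")
```

Even with perturbations of only 1e-4, the members diverge substantially after 30 steps; the spread is the forecaster's handle on uncertainty, which is why real ensembles (like the 10,240-member run on the K computer) are so valuable.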

IBM Expands High Performance Computing Capabilities in the Cloud

July 24, 2014 2:18 pm | by IBM | News | Comments

IBM is making high performance computing more accessible through the cloud for clients grappling with big data and other computationally intensive activities. A new option from SoftLayer will provide industry-standard InfiniBand networking technology to connect SoftLayer bare metal servers. This will enable very high data throughput speeds between systems, allowing companies to move workloads traditionally associated with HPC to the cloud.

Internet of Things and Hadoop to be featured at ISC Big Data

July 21, 2014 2:07 pm | by ISC | News | Comments

The second ISC Big Data conference, themed “From Data To Knowledge,” builds on the success of the inaugural 2013 event. A comprehensive program has been put together by the Steering Committee under the leadership of Sverre Jarp, who retired officially as the CTO of CERN openlab in March of this year.

Cray Awarded Contract to Install India's First Cray XC30 Supercomputer

July 16, 2014 3:33 am | by Cray | News | Comments

The Cray XC30 system will be used by a nation-wide consortium of scientists called the Indian Lattice Gauge Theory Initiative (ILGTI). The group will research the properties of a phase of matter called the quark-gluon plasma, which existed when the universe was approximately a microsecond old. ILGTI also carries out research on exotic and heavy-flavor hadrons, which will be produced in hadron collider experiments.


Registration Opens for ISC Cloud and ISC Big Data Conferences

July 15, 2014 11:28 am | by ISC | News | Comments

Registration is now open for the 2014 ISC Cloud and ISC Big Data Conferences, which will be held this fall in Heidelberg, Germany. The fifth ISC Cloud Conference will take place in the Marriott Hotel from September 29 to 30, and the second ISC Big Data will be held from October 1 to 2 at the same venue.

On the Trail of Paradigm-Shifting Methods for Solving Mathematical Models

July 15, 2014 10:11 am | by Hengguang Li | Blogs | Comments

How using CPU/GPU parallel computing is the next logical step: My work in computational mathematics is focused on developing new, paradigm-shifting ideas in numerical methods for solving mathematical models in various fields. This includes the Schrödinger equation in quantum mechanics, the elasticity model in mechanical engineering, the Navier-Stokes equation in fluid mechanics, Maxwell’s equations in electromagnetism...

IBM Announces $3B Research Initiative to Tackle Chip Grand Challenges for Cloud and Big Data Systems

July 9, 2014 4:58 pm | by IBM | News | Comments

IBM has announced it is investing $3 billion over the next five years in two broad research and early stage development programs to push the limits of chip technology needed to meet the emerging demands of cloud computing and Big Data systems. These investments are intended to push IBM's semiconductor innovations from today’s breakthroughs into the advanced technology leadership required for the future.

Moab HPC Suite-Enterprise Edition 8.0

July 7, 2014 10:04 am | Adaptive Computing | Product Releases | Comments

Moab HPC Suite-Enterprise Edition 8.0 (Moab 8.0) is designed to enhance Big Workflow by processing intensive simulations and big data analysis to accelerate insights. It delivers dynamic scheduling, provisioning and management of multi-step/multi-application services across HPC, cloud and big data environments. The software suite bolsters Big Workflow’s core services: unifying data center resources, optimizing the analysis process and guaranteeing services to the business.

'Deep Learning' Makes Search for Exotic Particles Easier

July 2, 2014 4:10 pm | by UC Irvine | News | Comments

Fully automated “deep learning” by computers greatly improves the odds of discovering particles such as the Higgs boson, beating even veteran physicists’ abilities.                         

To be able to use these huge amounts of data, we first have to understand them, and before that we need to categorize them in an effective, fast and automatic manner.

A Simple Solution for Big Data

June 27, 2014 11:19 am | by SISSA | News | Comments

To be able to use huge amounts of data, we first have to understand them, and before that we need to categorize them in an effective, fast and automatic manner. Two researchers have devised a type of cluster analysis (the ability to group data sets according to their "similarity") based on simple and powerful principles, which proved to be very efficient and capable of solving some of the most typical problems encountered in this type of analysis.
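One simple and powerful clustering principle of the kind described above is density-peak clustering: cluster centers are points of high local density that lie far from any denser point. The sketch below is an illustration of that idea on synthetic data, not the researchers' actual method or code.

```python
import numpy as np

# Minimal density-peak style sketch: centers combine high local density with
# isolation from denser points. Illustrative data and parameters only.
rng = np.random.default_rng(1)
pts = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])

d = np.linalg.norm(pts[:, None] - pts[None, :], axis=2)  # pairwise distances
rho = np.exp(-(d / 0.5) ** 2).sum(axis=1)                # smooth local density

# delta: distance from each point to its nearest higher-density neighbor
delta = np.empty(len(pts))
for i in range(len(pts)):
    higher = rho > rho[i]
    delta[i] = d[i, higher].min() if higher.any() else d[i].max()

# Centers score high on both density and isolation; take the top two here.
centers = np.argsort(rho * delta)[-2:]
labels = np.argmin(d[:, centers], axis=1)  # assign points to nearest center
print("cluster sizes:", np.bincount(labels))
```

The appeal of this family of methods is that it is automatic: the number of clusters falls out of the density/isolation scores rather than being supplied in advance.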

Algorithm lets independent agents collectively produce a machine-learning model without aggregating data.

Robots Collaborate Independently

June 26, 2014 11:05 am | by Larry Hardesty, MIT | News | Comments

Machine learning, in which computers learn new skills by looking for patterns in training data, is the basis of most recent advances in artificial intelligence, from voice-recognition systems to self-parking cars. It’s also the technique that autonomous robots typically use to build models of their environments. That type of model-building gets complicated, however, in cases in which clusters of robots work as teams.
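The general idea behind collaborative learning without pooling data can be shown in miniature: each agent fits a model on its own data, then only the fitted parameters are shared and combined. This is a hedged stand-in for that idea (simple parameter averaging on a linear model), not MIT's actual algorithm.

```python
import numpy as np

# Sketch of decentralized learning: several agents each fit a linear model
# on their own local data, then share only fitted parameters, which are
# averaged into a collective model -- no raw data is ever aggregated.
rng = np.random.default_rng(7)
true_w = np.array([2.0, -1.0])  # ground truth the agents are all estimating

def local_fit(n=100):
    X = rng.standard_normal((n, 2))
    y = X @ true_w + 0.1 * rng.standard_normal(n)
    w, *_ = np.linalg.lstsq(X, y, rcond=None)  # local least-squares fit
    return w

agent_models = [local_fit() for _ in range(5)]  # independent local training
collective = np.mean(agent_models, axis=0)      # combine parameters only
print("collective model:", collective.round(2))
```

Averaging works cleanly here because the model is linear; the hard part, and the subject of the research above, is making such combination schemes behave for richer models and heterogeneous robot teams.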

Jets resulting from particle collisions, like those taking place at the Large Hadron Collider (LHC) housed at CERN near Geneva, Switzerland, are quite possibly the single most important experimental signatures in high-energy physics.

High-energy Physics: Predicting the Emergence of Jets

June 19, 2014 4:03 pm | by Amber Harmon, iSGTW | News | Comments

Jets resulting from particle collisions, like those taking place at the Large Hadron Collider (LHC) housed at CERN near Geneva, Switzerland, are quite possibly the single most important experimental signatures in high-energy physics. Virtually every final-state, high-energy particle produced will be part of a jet.

Steve Conway is Research VP, IDC High Performance Computing Group.

When Massive Data Never becomes Big Data

June 18, 2014 3:38 pm | by Steve Conway, IDC | Blogs | Comments

The recent PRACE Days conference in Barcelona provided powerful reminders that massive data doesn't always become big data — mainly because moving and storing massive data can cost massive money. PRACE is the Partnership for Advanced Computing in Europe, and the 2014 conference was the first to bring together scientific and industrial users of PRACE supercomputers located in major European nations.

IDC’s new in-depth forecasts are the first that track more than a dozen application and industry segments, including economically important new use cases for HPC.

IDC Announces First In-Depth Forecasts for Worldwide HPC Big Data Market

June 18, 2014 8:57 am | by IDC | News | Comments

IDC has announced the availability of the first in-depth forecasts for high performance data analysis (HPDA), the fast-growing worldwide market for big data workloads that use high performance computing resources. IDC forecasts that the server market for HPDA will grow rapidly at 23.5 percent compound annual growth rate (CAGR) to reach $2.7 billion in 2018 and the related storage market will expand to about $1.6 billion in the same year.
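The compound-growth arithmetic behind such forecasts is easy to check. Only the 23.5 percent rate and the $2.7 billion 2018 figure come from the announcement; the 2013 base year below is an assumption for illustration.

```python
# Compound annual growth rate (CAGR) check: at 23.5% per year, what implied
# base reaches the forecast $2.7B by 2018? (The 2013 base year is an
# assumption for illustration; only the rate and 2018 figure are from IDC.)
cagr, years, target_2018 = 0.235, 2018 - 2013, 2.7  # billions USD
implied_2013_base = target_2018 / (1 + cagr) ** years
print(f"implied 2013 HPDA server market: ${implied_2013_base:.2f}B")
```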

GE Intelligent Platforms User Summit

June 16, 2014 3:26 pm | Events

GE Intelligent Platforms User Summit will address how GE is making the Industrial Internet real. Speakers will include Jeff Immelt, Chairman and CEO, GE; Christine Furstoss, Global Technology Leader, Manufacturing Technologies for GE’s Global Research Center; Ron Reis, Senior Service Manager at GE Oil & Gas; GE Intelligent Platforms General Managers Bernie Anger and Jim Walsh; and customers from all over the world in Oil & Gas, Water and Manufacturing, among others.

A new structure developed by UCLA researchers for more energy-efficient computer chips. The arrows indicate the effective magnetic field due to the structure's asymmetry. Courtesy of UCLA Engineering

Innovative Nanoscale Structure Could Yield Higher-performance Computer Memory

June 12, 2014 3:22 pm | by Matthew Chin, UCLA | News | Comments

Researchers at UCLA have created a nanoscale magnetic component for computer memory chips that could significantly improve their energy efficiency and scalability. The design brings a new and highly sought-after type of magnetic memory one step closer to being used in computers, mobile electronics such as smart phones and tablets, as well as large computing systems for big data.
