Big Data

The Lead

'Deep Web' Searching in the Name of Science

May 26, 2015 2:21 pm | by Elizabeth Landau, NASA | News | Comments

The Internet contains a vast trove of information - sometimes called the "Deep Web" - that isn't indexed by search engines: information that would be useful for tracking criminals, terrorist activities, sex trafficking and the spread of diseases. Scientists could also use it to search for images and data from spacecraft. 

Stories You Shouldn’t Miss — May 15-21

May 22, 2015 11:56 am | by Suzanne Tracy, Editor-in-Chief, Scientific Computing and HPC Source | News | Comments

In case you haven’t caught them yet, here's a recap of this week's most popular stories. Looking...

Deriving Real Time Value from Big Data

May 22, 2015 9:51 am | by Pat McGarry, Ryft Systems | Blogs | Comments

Everyone has heard the old adage that time is money. In today’s society, business moves at the...

User-friendly Data Query, Visualization Tools Enable Omics Data Integration

May 19, 2015 4:21 pm | by Leigh MacMillan, Vanderbilt University | News | Comments

Advances in technology have generated vast amounts of “omics” data: genomic, epigenomic,...

Researchers used methods from signal processing and text-mining to analyze the musical properties of songs. Their system automatically grouped the thousands of songs by patterns of chord changes and tone, allowing them to statistically identify trends.

Big Data Analysis of Sounds Creates 50-year Evolutionary History of Music Charts

May 14, 2015 9:18 am | by Queen Mary University of London | News | Comments

Evolutionary biologists and computer scientists have come together to study the evolution of pop music. Their analysis of 17,000 songs from the US Billboard Hot 100 charts, 1960 to 2010, is the most substantial scientific study of the history of popular music to date. They studied trends in style, the diversity of the charts, and the timing of musical revolutions.
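The study's actual pipeline isn't reproduced in the article; purely as a rough, hypothetical sketch of the general idea (summarize each song as a vector of chord-transition frequencies, then cluster those vectors), here is a minimal example using scikit-learn's k-means. The feature layout and cluster count are assumptions made for illustration only.

```python
# Illustrative sketch: grouping songs by chord-change patterns (NOT the study's
# actual pipeline). Assumes each song is summarized as a vector of relative
# chord-transition frequencies, an assumption made here for demonstration only.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import normalize

rng = np.random.default_rng(0)

# Hypothetical feature matrix: 1,000 songs x 24 chord-transition frequencies.
chord_features = rng.random((1000, 24))
chord_features = normalize(chord_features, norm="l1")  # frequencies sum to 1

# Group songs into a handful of stylistic clusters by their transition profiles.
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(chord_features)

# Count how many songs fall in each cluster, a crude proxy for chart "styles".
print(np.bincount(labels))
```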

The ISC Cloud & Big Data Research Committee is accepting submissions of high-quality papers in theoretical, experimental, industrial research and development until Tuesday, May 19, 2015.

Last Chance to Submit ISC Cloud & Big Data Research Papers

May 13, 2015 12:05 pm | by Suzanne Tracy, Editor-in-Chief, Scientific Computing and HPC Source | News | Comments

The ISC Cloud & Big Data Research Committee is accepting submissions until Tuesday, May 19, 2015. The Research Paper Sessions “aim to provide first-class open forums for engineers and scientists in academia, industry and government to present and discuss issues, trends and results to shape the future of cloud computing and big data.” The sessions will be held on Tuesday, September 29 and on Wednesday, September 30, 2015.

The new program builds on IBM Research advancements in analytics and existing Watson collaborations to develop a genome data analysis solution for clinicians. Partners involved in the program will use Watson Genomic Analytics.

14 Leading Cancer Institutes Collaborate to Advance Personalized Medicine for Cancer Patients

May 6, 2015 12:33 pm | by IBM | News | Comments

IBM Watson is collaborating with more than a dozen leading cancer institutes to accelerate the ability of clinicians to identify and personalize treatment options for their patients. The institutes will apply Watson's advanced cognitive capabilities to reduce from weeks to minutes the time it takes to translate DNA insights, understand a person's genetic profile and gather relevant information from medical literature to personalize treatment.

Michael Morris is General Manager at Appirio.

How Crowdsourcing can Solve Even Interstellar Problems

May 5, 2015 2:16 pm | by Michael Morris, Appirio | Blogs | Comments

Protecting the world from destruction by asteroids sounds like superhuman power, but NASA scientists work tirelessly to ensure that humans today are protected from this potential harm. Asteroids need to be hunted in order to identify which ones may endanger Earth, and analyzing the big data puzzle of asteroid detection has been an arduous process. That is, until the power of crowdsourcing was discovered.

EMC has announced new enhancements across its Documentum portfolio of enterprise content management (ECM) applications.

EMC Enhances Documentum Enterprise Content Management Applications

May 4, 2015 12:42 pm | by EMC | News | Comments

As the emergence of social media, cloud and big data continues to fuel the digital evolution, today’s digital workplace must drive new levels of employee engagement, operational efficiency and service excellence. To help deliver on this digital transformation, EMC has announced new enhancements across its Documentum portfolio of ECM applications, enabling users to further address next-generation ECM.

Dr. Jan Vitt, Head of IT Infrastructure, DZ Bank

ISC Cloud & Big Data Keynote Will Recount Bank’s Path into Cloud Computing

April 30, 2015 4:17 pm | by ISC | News | Comments

As the fourth largest cooperative bank in Germany, DZ Bank supports the business activities of over 900 other cooperative banks in the country. Dr. Jan Vitt, the Head of IT Infrastructure at DZ Bank, will be talking about how a conservative institution like his is effectively adopting cloud computing to address the IT needs of its various business divisions.

SQream DB GPU-based Columnar SQL Database

April 30, 2015 10:49 am | by SQream Technologies | Product Releases | Comments

SQream DB is a high-speed GPU-based columnar SQL database designed to uniquely address the speed, scalability and efficiency hurdles that face big data analytics. It is capable of processing and analyzing high volumes of data while delivering strong cost/performance.
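SQream's internals aren't detailed in this release; purely as a generic illustration of why a columnar layout suits analytics (an aggregate scans only the columns it needs), here is a minimal Python sketch with hypothetical column names, not SQream DB code.

```python
# Generic illustration of columnar storage for analytics (not SQream DB's
# implementation): each column lives in its own contiguous array, so an
# aggregate over one column never touches the others.
import numpy as np

rng = np.random.default_rng(1)
n_rows = 1_000_000

# Column-oriented layout: one array per column (column names are hypothetical).
table = {
    "region_id": rng.integers(0, 50, n_rows),
    "amount": rng.random(n_rows) * 100.0,
    "timestamp": rng.integers(1_400_000_000, 1_430_000_000, n_rows),
}

# SELECT SUM(amount) WHERE region_id = 7  -- scans only the two columns needed.
mask = table["region_id"] == 7
print(table["amount"][mask].sum())
```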

The search for life beyond our solar system requires unprecedented cooperation across scientific disciplines. NASA's NExSS collaboration includes those who study Earth as a life-bearing planet.

NASA’s NExSS Coalition to Lead Search for Life on Distant Worlds

April 28, 2015 3:58 pm | by NASA | News | Comments

NASA is bringing together experts spanning a variety of scientific fields for an unprecedented initiative dedicated to the search for life on planets outside our solar system. The Nexus for Exoplanet System Science, or “NExSS,” hopes to better understand the various components of an exoplanet, as well as how the planet, its host star and neighboring planets interact to support life.

The functional genetic network shown is just one of the 144 such networks identified for a diverse set of human tissues and cell types. Courtesy of Simons Center for Data Analysis

Computer Science, Statistical Methods Combine to Analyze Stunningly Diverse Genomic Big Data Collections

April 28, 2015 3:36 pm | by Simons Foundation | News | Comments

A multi-year study led by researchers from the Simons Center for Data Analysis and major universities and medical schools has broken substantial new ground, establishing how genes work together within 144 different human tissues and cell types in carrying out those tissues’ functions. The paper also demonstrates how computer science and statistical methods may combine to analyze genomic ‘big-data’ collections.
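The study's statistical models aren't reproduced in this summary; as a minimal, generic sketch of one common building block, constructing a co-expression network by thresholding pairwise correlations, here is an illustration on synthetic expression data. The matrix sizes and the 0.5 threshold are arbitrary assumptions, not the Simons Center's method.

```python
# Minimal illustration only (not the Simons Center's method): build a simple
# gene co-expression network by thresholding pairwise Pearson correlations
# computed from a synthetic expression matrix.
import numpy as np

rng = np.random.default_rng(2)
n_genes, n_samples = 200, 50
expression = rng.normal(size=(n_genes, n_samples))  # rows = genes

corr = np.corrcoef(expression)            # gene x gene correlation matrix
np.fill_diagonal(corr, 0.0)               # ignore self-correlation
edges = np.argwhere(np.abs(corr) > 0.5)   # "functional" links above a threshold
edges = edges[edges[:, 0] < edges[:, 1]]  # keep each undirected edge once

print(f"{len(edges)} edges among {n_genes} genes")
```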

Mike Hoard is Senior Staff, Product Marketing, at Seagate Cloud Systems and Solutions.

Hadoop on Lustre: A Storage Blueprint for Deriving Value from Data

April 27, 2015 4:00 pm | by Mike Hoard, Seagate Cloud Systems and Solutions | Blogs | Comments

As ubiquitous as the term “big data” has become, the path for drawing real, actionable insights hasn’t always been as clear. And the need is only becoming greater as organizations generate greater and greater amounts of structured and unstructured data. While data-intensive computing is not new to HPC environments, newer analytic frameworks, including Hadoop, are emerging as viable compasses for navigating these complex and growing volumes of data.
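The Lustre integration itself isn't shown in the post; as a reminder of the MapReduce programming model that Hadoop exposes on top of whatever storage sits underneath, here is a minimal, self-contained word-count sketch. It is a local stand-in for the map, shuffle/sort and reduce phases, not Hadoop API code.

```python
# Illustrative Hadoop-style word count: the MapReduce programming model that
# frameworks like Hadoop expose. The Lustre storage layer described in the
# post sits underneath and is transparent to code like this.
from itertools import groupby
from operator import itemgetter

def mapper(line):
    # Emit (word, 1) for every word on the line.
    for word in line.split():
        yield word, 1

def reducer(word, counts):
    # Sum the counts delivered for a single key.
    return word, sum(counts)

def run(lines):
    # Local stand-in for the map -> shuffle/sort -> reduce flow.
    pairs = sorted(kv for line in lines for kv in mapper(line))
    return [reducer(w, (c for _, c in grp))
            for w, grp in groupby(pairs, key=itemgetter(0))]

print(run(["big data on lustre", "hadoop on lustre", "big big data"]))
```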

Univa Universal Resource Broker, Powered by Grid Engine

April 27, 2015 10:26 am | Univa, Inc. | Product Releases | Comments

Universal Resource Broker is an enterprise-class workload optimization solution for high performance, containerized and shared data centers. It is designed to enable organizations to achieve massive scalability of shared data center resources and to lay the foundation for the Internet of Things.

The Cori Phase 1 system will be the first supercomputer installed in the new Computational Research and Theory Facility now in the final stages of construction at Lawrence Berkeley National Laboratory.

Cray XC40 will be First Supercomputer in Berkeley Lab’s New Computational Research and Theory Facility

April 23, 2015 3:17 pm | by NERSC and Berkeley Lab | News | Comments

The U.S. Department of Energy’s (DOE) National Energy Research Scientific Computing (NERSC) Center and Cray announced they have finalized a new contract for a Cray XC40 supercomputer that will be the first NERSC system installed in the newly built Computational Research and Theory facility at Lawrence Berkeley National Laboratory.

The new method reduces computing power needed to process large amounts of multidimensional relational data by providing a simple technique of cutting down redundant layers of information, reducing the amount of data to be processed.

Mathematicians Reduce Big Data Using Ideas from Quantum Theory

April 23, 2015 2:02 pm | by Queen Mary University of London | News | Comments

A new technique of visualizing the complicated relationships between anything from Facebook users to proteins in a cell provides a simpler and cheaper method of making sense of large volumes of data.
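The published quantum-inspired technique isn't described here in enough detail to reproduce; purely as a rough illustration of the general idea of cutting redundant layers from multilayer relational data, here is a sketch that drops layers whose edge sets heavily overlap an earlier layer. Jaccard similarity and the 0.8 threshold are assumptions for this example, not the paper's method.

```python
# Rough illustration only, NOT the published quantum-inspired method: drop
# layers of a multilayer network that are nearly redundant with an earlier
# layer, judged here by Jaccard similarity of their edge sets.
def jaccard(a, b):
    return len(a & b) / len(a | b) if (a | b) else 1.0

def reduce_layers(layers, threshold=0.8):
    kept = []
    for edges in layers:
        if all(jaccard(edges, other) < threshold for other in kept):
            kept.append(edges)
    return kept

# Three relational "layers" over the same nodes; the second is a near-copy.
layers = [
    {(1, 2), (2, 3), (3, 4), (4, 5)},
    {(1, 2), (2, 3), (3, 4), (4, 5), (1, 5)},  # redundant with the first layer
    {(1, 4), (2, 5)},
]
print(len(reduce_layers(layers)))  # -> 2: one redundant layer removed
```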

Leo Reiter is a cloud computing pioneer who has been designing, developing, and evangelizing large scale, on demand systems and technologies since the mid-1990s. Currently, Leo serves as Chief Technology Officer of Nimbix, Inc.

Big Data is Driving HPC to the Cloud

April 21, 2015 2:09 pm | by Leo Reiter, CTO, Nimbix, Inc. | Blogs | Comments

For many computationally intensive applications, such as simulation, seismic processing and rendering, overall speed is still the name of the game. However, a new branch of HPC is gaining momentum. IDC calls it “High Performance Data Analysis” (HPDA for short). Essentially, it’s the union of big data and HPC. How will these architectures evolve? Let’s start by looking at the data.

The new Cray XC40 supercomputer and Sonexion storage system, Powered by Seagate, will provide PGS with the advanced computational capabilities necessary to run highly complex seismic processing and imaging applications.

Seagate Storage Technology Powering Four Cray Advanced HPC Implementations

April 21, 2015 11:49 am | by Seagate | News | Comments

Seagate Technology has announced that four Cray customers will be among the first to implement Seagate’s latest high performance computing storage technology. Combined, the implementations of these four customers in the government, weather, oil and gas, and university sectors will consume more than 120 petabytes of storage capacity.

Salford Predictive Modeler Data Mining Software Suite

April 21, 2015 11:16 am | Salford Systems | Product Releases | Comments

The Salford Predictive Modeler (SPM) software suite includes CART, MARS, TreeNet and Random Forests, as well as powerful automation and modeling capabilities. The software is designed to be a highly accurate and ultra-fast analytics and data mining platform for creating predictive, descriptive and analytical models from databases of any size, complexity or organization.
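SPM itself is a commercial suite; as an analogous open-source illustration of the tree-ensemble idea behind CART and Random Forests, here is a scikit-learn sketch on a bundled dataset. This is not Salford's engine, just the same family of algorithms in a freely available library.

```python
# Open-source analogue of the tree-ensemble idea behind CART/Random Forests
# (scikit-learn here, not Salford's SPM engine).
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.3f}")
```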

Researchers tracked asthma-related tweets around the world, then zoomed in on a particular region to see how the social media posts, when analyzed alongside other data, could help them predict asthma-related emergency room visits.

How Twitter Can Help Predict Emergency Room Visits

April 16, 2015 12:16 pm | by Alexis Blue, University of Arizona | News | Comments

A predictive model using machine learning algorithms is able to predict with 75 percent accuracy how many asthma-related emergency room visits a hospital could expect on a given day. Twitter users who post information about their personal health online might be considered by some to be "over-sharers," but new research suggests that health-related tweets may have the potential to be helpful for hospitals.
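The Arizona team's actual features and model aren't reproduced here; as a hypothetical sketch of the general approach, regressing daily ER visit counts on the day's asthma-tweet count plus an air-quality signal, here is an example on entirely simulated data.

```python
# Sketch of the general approach only (synthetic data; NOT the published model
# or features): predict daily asthma-related ER visits from the day's
# asthma-tweet count and a hypothetical air-quality index.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n_days = 365
tweets = rng.poisson(40, n_days)                      # asthma tweets per day
aqi = rng.normal(60, 15, n_days)                      # air-quality index
visits = rng.poisson(2 + 0.1 * tweets + 0.05 * aqi)   # simulated ER visits

X = np.column_stack([tweets, aqi])
X_tr, X_te, y_tr, y_te = train_test_split(X, visits, random_state=0)

model = LinearRegression().fit(X_tr, y_tr)
print(f"R^2 on held-out days: {model.score(X_te, y_te):.2f}")
```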

Ryft One

April 16, 2015 10:34 am | Ryft Systems, Inc. | Product Releases | Comments

Ryft One is an open platform to analyze streaming, historical, unstructured, and multi-structured data in real-time. It is a commercial 1U platform capable of providing fast and actionable business insights by analyzing both historical and streaming data at an unprecedented 10 Gigabytes/second or faster.

Big Data Key to Precision Medicine's Success

April 15, 2015 4:04 pm | by Weill Cornell Medical College | News | Comments

Technological advances are enabling scientists to sequence the genomes of cancer tumors, revealing a detailed portrait of genetic mutations that drive these diseases. But genomic studies are only one piece of the puzzle that is precision medicine. In order to realize the promise of this field, there needs to be an increased focus on creating robust clinical databases.

Big Data Finds Ideal River Locations for Hydro-Power

April 13, 2015 3:41 pm | by University of Leicester | News | Comments

A new technology in development has the potential to revolutionize the sourcing of renewable energy from rivers.

Genomics processing is now moving into mainstream clinical applications, as new approaches to diagnosis and treatment involving genomics gain interest.

Efficient, Time Sensitive Execution of Next-gen Sequencing Pipelines Critical for Translational Medicine

April 6, 2015 3:26 pm | by Suzanne Tracy, Editor-in-Chief, Scientific Computing and HPC Source | Blogs | Comments

Demand for genomics processing is rapidly spreading from research labs to the clinical arena. Genomics is now a "must have" tool for researchers in areas of oncology and rare diseases. It is also becoming a requirement in the clinical space for precision medicine, translational medicine and similar "bench to bedside" initiatives.

Hubble telescope image of stars forming inside a cloud of cold hydrogen gas and dust in the Carina Nebula, 7,500 light-years away. Courtesy of Space Telescope Science Institute

Automation Provides Big Data Solution to Astronomy’s Data Deluge

April 2, 2015 9:40 am | by David Tenenbaum, University of Wisconsin-Madison | News | Comments

It’s almost a rite of passage in physics and astronomy. Scientists spend years scrounging up money to build a fantastic new instrument. Then, when the long-awaited device finally approaches completion, the panic begins: How will they handle the torrent of data? The Square Kilometer Array will have an unprecedented ability to deliver data on the location and properties of stars, galaxies and giant clouds of hydrogen gas.

In New York City, Manju Malkani, IBM analytics consultant, and Paul Walsh, Vice President of Weather Analytics at The Weather Company, access real-time weather data through IBM Watson Analytics.

The Weather Company Migrates Data Services to IBM Cloud, Plans to Advance Internet of Things Solutions

March 31, 2015 1:43 pm | by IBM | News | Comments

IBM and The Weather Company have announced a global strategic alliance to integrate real-time weather insights into business to improve operational performance and decision-making. As part of the alliance, The Weather Company, including its global B2B division WSI, will shift its massive weather data services platform to the IBM Cloud and integrate its data with IBM analytics and cloud services.

Tri-TON, an Autonomous Underwater Vehicle (AUV) that U.S. and Japanese researchers will use for the real-time verification of their olfactory search algorithms. Courtesy of Tamer Zaki, Johns Hopkins University

U.S., Japan Bring Big Data and Data Analytics to Disaster Response

March 31, 2015 12:29 pm | by NSF | News | Comments

When disaster strikes, it is critical that experts, decision makers and emergency personnel have access to real-time information in order to assess the situation and respond appropriately. It is equally critical that individuals and organizations have the capacity to analyze the wealth of data generated in the midst of the disaster and its immediate aftermath in order to produce accurate, customized warnings.

MOVIA Big Data Analytics Platform

March 30, 2015 1:38 pm | by Modus Operandi, Inc. | Product Releases | Comments

MOVIA Big Data Analytics Platform is designed to help organizations watch for important patterns in their data and generate instant alerts to users or other systems. The software enables improved prediction of trends through advanced data modeling that captures situational context, so decisions are not ‘made in a vacuum.’
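MOVIA's modeling engine isn't described in detail in this release; as a bare-bones illustration of the watch-a-stream-and-alert pattern, here is a sliding-window spike detector. The window size and threshold factor are arbitrary assumptions, not MOVIA parameters.

```python
# Bare-bones illustration of "watch a stream, alert on a pattern" (a simple
# sliding-window threshold rule, not MOVIA's modeling engine).
from collections import deque

def alert_on_spikes(readings, window=5, factor=2.0):
    """Yield an alert whenever a reading exceeds `factor` x the recent average."""
    recent = deque(maxlen=window)
    for i, value in enumerate(readings):
        if len(recent) == window and value > factor * (sum(recent) / window):
            yield f"alert: reading {value} at position {i} is a spike"
        recent.append(value)

stream = [10, 11, 9, 10, 12, 10, 31, 11, 10]  # one obvious spike at 31
for message in alert_on_spikes(stream):
    print(message)
```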
