Big Data

The Lead

Michael King is Senior Director of Marketing at DataDirect Networks (DDN).

Winds of Change are Bringing Fresh Solutions to High Performance Data Storage

June 30, 2015 7:53 am | by Michael King, DataDirect Networks | Blogs | Comments

Large-scale scientific organizations are grappling with the implications of rapid data growth. Massive data collections, analytics and the need for data collaboration are driving demand for high-performance storage solutions that deliver fast time to results. A different breed of technologies, developed originally for the supercomputing industry, is being adapted to meet the needs of technical computing organizations.

PowerEdge C6320 Server

June 24, 2015 11:41 am | Product Releases | Comments

The PowerEdge C6320 server is purpose-built for high-performance computing and hyper-converged...

Cognitive Computing App Taps 10,000 Bon Appétit Recipes, Suggests Creative Flavor Combinations

June 24, 2015 11:12 am | by IBM Watson | News | Comments

IBM and Bon Appétit have introduced a one-of-a-kind Chef Watson cognitive computing cooking app...

Broad Institute Genome Analysis Toolkit offered as part of Google Genomics

June 24, 2015 7:56 am | by Broad Institute | News | Comments

Broad Institute is teaming up with Google Genomics to explore how to break down major technical...


A neuroblastoma: TGen's extended partnership with Dell will help it optimize a high-performance computing infrastructure to enable researchers to analyze and store massive amounts of genetic data more quickly and reach more patients than ever before.

First-of-a-kind Clinical Trials Support Fight against Pediatric Cancer

June 11, 2015 5:08 pm | by TGen | News | Comments

Dell has announced an extended partnership with TGen to help clinical researchers and doctors globally expand the reach and impact of the world's first FDA-approved precision medicine trial for pediatric cancer. The renewed commitment includes an additional $3 million Dell grant to support continued collaboration with TGen and the Neuroblastoma and Medulloblastoma Translational Research Consortium (NMTRC), and to expand pediatric cancer clinical trials into EMEA.

With keynote speakers from industry and international big-science projects, the two-and-one-half-day New York Scientific Data Summit is organized into five sessions, with topics including scientific image analysis, data fusion, and environmental and urban science.

Accelerating Data-driven Discovery and Innovation: New York Scientific Data Summit

June 5, 2015 9:00 am | by Suzanne Tracy, Editor-in-Chief, Scientific Computing and HPC Source | News | Comments

New York Scientific Data Summit is a no-fee annual meeting that aims to accelerate data-driven discovery and innovation by bringing together researchers, developers and end-users from academia, industry, utilities and state and federal governments. Jointly organized by Brookhaven National Laboratory, Stony Brook University and New York University, this year’s conference will take place from August 2 to 5, 2015, at NYU.


Registration Opens for ISC Cloud & Big Data

May 28, 2015 3:40 pm | by ISC | News | Comments

ISC Events has announced that registration is now open for the inaugural ISC Cloud & Big Data event, which will be held this fall in Frankfurt, Germany. The entire conference will take place at the Frankfurt Marriott hotel, located in the city center. The three-day event will kick off with a full-day workshop on September 28, followed by the main program on September 29 and 30.


'Deep Web' Searching in the Name of Science

May 26, 2015 2:21 pm | by Elizabeth Landau, NASA | News | Comments

The Internet contains a vast trove of information - sometimes called the "Deep Web" - that isn't indexed by search engines: information that would be useful for tracking criminals, terrorist activities, sex trafficking and the spread of diseases. Scientists could also use it to search for images and data from spacecraft. 
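Much of that content is reachable only by submitting the query a search form expects, which is why static crawlers never see it. A minimal Python sketch of the distinction, using the third-party requests library (the endpoint URL and form fields below are hypothetical placeholders):

```python
import requests

# A static crawler only sees what is linked from the landing page.
landing = requests.get("https://example.org/records")

# "Deep web" records are generated on demand in response to a query,
# so they are reachable only by submitting the form the site expects.
# Endpoint and field names here are hypothetical placeholders.
results = requests.post(
    "https://example.org/records/search",
    data={"query": "spacecraft imagery", "year": "2015"},
)
print(results.status_code, len(results.text))
```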

When you look at this photograph, what colors are the dress?

Stories You Shouldn’t Miss — May 15-21

May 22, 2015 11:56 am | by Suzanne Tracy, Editor-in-Chief, Scientific Computing and HPC Source | News | Comments

In case you haven’t caught them yet, here's a recap of this week's most popular stories. Looking at the universe as a hologram; diesel fuel from carbon dioxide and water; first observations of a rare subatomic process; a big data history of music charts; secrets of colossal, invisible waves; perceptions of dress colors; and more are among the top hits.

Pat McGarry is Vice President of Engineering with Ryft Systems.

Deriving Real Time Value from Big Data

May 22, 2015 9:51 am | by Pat McGarry, Ryft Systems | Blogs | Comments

Everyone has heard the old adage that time is money. In today’s society, business moves at the speed of making a phone call, looking something up online via your cell phone, or posting a tweet. So, when time is money (and can be a lot of money), why are businesses okay with waiting weeks or even months to get valuable information from their data?

Investigators have applied NetGestalt to data from The Cancer Genome Atlas (TCGA) colorectal cancer cohort, the first tumor dataset with complete molecular measurements at DNA, RNA and protein levels.

User-friendly Data Query, Visualization Tools Enable Omics Data Integration

May 19, 2015 4:21 pm | by Leigh MacMillan, Vanderbilt University | News | Comments

Advances in technology have generated vast amounts of “omics” data: genomic, epigenomic, transcriptomic, proteomic and metabolomic changes for all types of specimens. Bridging the gap between data generation and investigators’ ability to retrieve and interpret data is essential to realize the biological and clinical value of this wealth of information.

Researchers used methods from signal processing and text-mining to analyze the musical properties of songs. Their system automatically grouped the thousands of songs by patterns of chord changes and tone, allowing them to statistically identify trends.

Big Data Analysis of Sounds Creates 50-year Evolutionary History of Music Charts

May 14, 2015 9:18 am | by Queen Mary University of London | News | Comments

Evolutionary biologists and computer scientists have come together to study the evolution of pop music. Their analysis of 17,000 songs from the US Billboard Hot 100 charts, 1960 to 2010, is the most substantial scientific study of the history of popular music to date. They studied trends in style, the diversity of the charts, and the timing of musical revolutions.
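The study's own pipeline is more elaborate, but the general shape of the approach can be sketched: represent each song as a vector of harmonic features, cluster the vectors into styles, and track each style's share of the charts over time. The sketch below uses toy random data and scikit-learn's KMeans, not the Billboard corpus or the authors' code:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Toy stand-in for real data: 1,000 songs x 24 chord-transition counts
# (e.g. how often each chord-change pattern occurs in a song), plus a year.
songs = rng.poisson(lam=3.0, size=(1000, 24)).astype(float)
years = rng.integers(1960, 2011, size=1000)

# Normalize per song so clusters reflect style rather than song length.
songs /= songs.sum(axis=1, keepdims=True)

# Group songs into stylistic clusters by their harmonic profiles.
styles = KMeans(n_clusters=8, n_init=10, random_state=0).fit_predict(songs)

# A trend is each style's share per decade; abrupt shifts in these shares
# are what the study interpreted as musical "revolutions."
for decade in range(1960, 2010, 10):
    mask = (years >= decade) & (years < decade + 10)
    share = np.bincount(styles[mask], minlength=8) / mask.sum()
    print(decade, np.round(share, 2))
```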

The ISC Cloud & Big Data Research Committee is accepting submissions of high-quality papers in theoretical, experimental and industrial research and development until Tuesday, May 19, 2015.

Last Chance to Submit ISC Cloud & Big Data Research Papers

May 13, 2015 12:05 pm | by Suzanne Tracy, Editor-in-Chief, Scientific Computing and HPC Source | News | Comments

The ISC Cloud & Big Data Research Committee is accepting submissions until Tuesday, May 19, 2015. The Research Paper Sessions “aim to provide first-class open forums for engineers and scientists in academia, industry and government to present and discuss issues, trends and results to shape the future of cloud computing and big data.” The sessions will be held on Tuesday, September 29 and on Wednesday, September 30, 2015.

The new program builds on IBM Research advancements in analytics and existing Watson collaborations to develop a genome data analysis solution for clinicians. Partners involved in the program will use Watson Genomic Analytics.

14 Leading Cancer Institutes Collaborate to Advance Personalized Medicine for Cancer Patients

May 6, 2015 12:33 pm | by IBM | News | Comments

IBM Watson is collaborating with more than a dozen leading cancer institutes to accelerate the ability of clinicians to identify and personalize treatment options for their patients. The institutes will apply Watson's advanced cognitive capabilities to reduce, from weeks to minutes, the time needed to translate DNA insights, understand a person's genetic profile and gather relevant information from medical literature to personalize treatment.

Michael Morris is General Manager at Appirio.

How Crowdsourcing can Solve Even Interstellar Problems

May 5, 2015 2:16 pm | by Michael Morris, Appirio | Blogs | Comments

Protecting the world from destruction by asteroids sounds like a superhuman feat, but NASA scientists work tirelessly to ensure that humans today are protected from this potential harm. Asteroids need to be hunted in order to identify which ones may endanger Earth, and analyzing the big data puzzle of asteroid detection has been an arduous process. That is, until the power of crowdsourcing was discovered.

EMC has announced new enhancements across its Documentum portfolio of enterprise content management (ECM) applications.

EMC Enhances Documentum Enterprise Content Management Applications

May 4, 2015 12:42 pm | by EMC | News | Comments

As the emergence of social media, cloud and big data continues to fuel the digital evolution, today’s digital workplace must drive new levels of employee engagement, operational efficiency and service excellence. To help deliver on this digital transformation, EMC has announced new enhancements across its Documentum portfolio of ECM applications, enabling users to further address next-generation ECM.

Dr. Jan Vitt, Head of IT Infrastructure, DZ Bank

ISC Cloud & Big Data Keynote Will Recount Bank’s Path into Cloud Computing

April 30, 2015 4:17 pm | by ISC | News | Comments

As the fourth largest cooperative bank in Germany, DZ Bank supports the business activities of over 900 other cooperative banks in the country. Dr. Jan Vitt, the Head of IT Infrastructure at DZ Bank, will be talking about how a conservative institution like his is effectively adopting cloud computing to address the IT needs of its various business divisions.


SQream DB GPU-based Columnar SQL Database

April 30, 2015 10:49 am | by SQream Technologies | Product Releases | Comments

SQream DB is a high-speed, GPU-based columnar SQL database designed to address the speed, scalability and efficiency hurdles that face big data analytics. It is capable of processing and analyzing high volumes of data while delivering strong price/performance.
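The engine itself is proprietary, but the columnar idea behind it is easy to illustrate: an analytic aggregate over one field only has to scan that field's contiguous column rather than stride across every row's full record, which is what lets a GPU stream it at full memory bandwidth. A minimal NumPy sketch of the layout difference (illustrative, not SQream's implementation):

```python
import numpy as np

n = 1_000_000

# Row-oriented layout: each record's fields are stored together, so a
# scan of one field strides across memory it does not need.
rows = np.zeros(n, dtype=[("id", "i8"), ("price", "f8"), ("qty", "i8")])
rows["price"] = np.random.rand(n)

# Column-oriented layout: each field is its own dense, contiguous array.
price_col = np.ascontiguousarray(rows["price"])

# Same aggregate either way; the columnar copy is the cache- and
# GPU-friendly one because it touches only the bytes it needs.
print(rows["price"].sum(), price_col.sum())
```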

The search for life beyond our solar system requires unprecedented cooperation across scientific disciplines. NASA's NExSS collaboration includes those who study Earth as a life-bearing planet.

NASA’s NExSS Coalition to Lead Search for Life on Distant Worlds

April 28, 2015 3:58 pm | by NASA | News | Comments

NASA is bringing together experts spanning a variety of scientific fields for an unprecedented initiative dedicated to the search for life on planets outside our solar system. The Nexus for Exoplanet System Science, or “NExSS,” hopes to better understand the various components of an exoplanet, as well as how a planet's host star and neighboring planets interact to support life.

The functional genetic network shown is just one of the 144 such networks identified for a diverse set of human tissues and cell types. Courtesy of Simons Center for Data Analysis

Computer Science, Statistical Methods Combine to Analyze Stunningly Diverse Genomic Big Data Collections

April 28, 2015 3:36 pm | by Simons Foundation | News | Comments

A multi-year study led by researchers from the Simons Center for Data Analysis and major universities and medical schools has broken substantial new ground, establishing how genes work together within 144 different human tissues and cell types in carrying out those tissues’ functions. The paper also demonstrates how computer science and statistical methods may combine to analyze genomic ‘big-data’ collections.

Mike Hoard is Senior Staff, Product Marketing, at Seagate Cloud Systems and Solutions.

Hadoop on Lustre: A Storage Blueprint for Deriving Value from Data

April 27, 2015 4:00 pm | by Mike Hoard, Seagate Cloud Systems and Solutions | Blogs | Comments

As ubiquitous as the term “big data” has become, the path for drawing real, actionable insights hasn’t always been as clear. And the need is only growing as organizations generate ever-larger amounts of structured and unstructured data. While data-intensive computing is not new to HPC environments, newer analytic frameworks, including Hadoop, are emerging as viable compasses for navigating complex volumes of data.
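As a concrete reference point, the canonical Hadoop example is a streaming word count, which runs unchanged whether HDFS or a Lustre-backed file system sits underneath; that transparency is the point of blueprints like Hadoop on Lustre. A minimal sketch using Hadoop Streaming conventions (wordcount.py is a hypothetical script name, invoked once as the mapper and once as the reducer):

```python
#!/usr/bin/env python3
# Minimal Hadoop Streaming word count. Hadoop sorts the map output by key
# before the reduce phase, which is what the reducer below relies on.
import sys

def mapper():
    for line in sys.stdin:
        for word in line.split():
            print(f"{word}\t1")

def reducer():
    current, count = None, 0
    for line in sys.stdin:
        word, n = line.rsplit("\t", 1)
        if word != current:
            if current is not None:
                print(f"{current}\t{count}")
            current, count = word, 0
        count += int(n)
    if current is not None:
        print(f"{current}\t{count}")

if __name__ == "__main__":
    # Run as "wordcount.py map" for the mapper, anything else for the reducer.
    mapper() if sys.argv[1:] == ["map"] else reducer()
```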


Univa Universal Resource Broker, Powered by Grid Engine

April 27, 2015 10:26 am | Univa, Inc. | Product Releases | Comments

Universal Resource Broker is an enterprise-class workload optimization solution for high performance, containerized and shared data centers. It is designed to enable organizations to achieve massive scalability of shared data center resources and to lay the foundation for the Internet of Things.

The Cori Phase 1 system will be the first supercomputer installed in the new Computational Research and Theory Facility now in the final stages of construction at Lawrence Berkeley National Laboratory.

Cray XC40 will be First Supercomputer in Berkeley Lab’s New Computational Research and Theory Facility

April 23, 2015 3:17 pm | by NERSC and Berkeley Lab | News | Comments

The U.S. Department of Energy’s (DOE) National Energy Research Scientific Computing (NERSC) Center and Cray announced they have finalized a new contract for a Cray XC40 supercomputer that will be the first NERSC system installed in the newly built Computational Research and Theory facility at Lawrence Berkeley National Laboratory.

The new method reduces computing power needed to process large amounts of multidimensional relational data by providing a simple technique of cutting down redundant layers of information, reducing the amount of data to be processed.

Mathematicians Reduce Big Data Using Ideas from Quantum Theory

April 23, 2015 2:02 pm | by Queen Mary University of London | News | Comments

A new technique of visualizing the complicated relationships between anything from Facebook users to proteins in a cell provides a simpler and cheaper method of making sense of large volumes of data.
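Without claiming to reproduce the authors' method, the intuition can be pictured in a few lines: in a multilayer network, layers that encode nearly the same relationships are redundant, and merging them shrinks the data with little loss of structure. A toy Python sketch of that intuition:

```python
import numpy as np

rng = np.random.default_rng(1)

def random_layer(n=50, p=0.1):
    # Random symmetric adjacency matrix with no self-loops.
    a = (rng.random((n, n)) < p).astype(float)
    return np.triu(a, 1) + np.triu(a, 1).T

base = random_layer()
layers = [
    base,
    np.clip(base + random_layer(p=0.01), 0, 1),  # near-duplicate of base
    random_layer(),                              # genuinely different layer
]

def similarity(a, b):
    # Cosine similarity between flattened adjacency matrices.
    fa, fb = a.ravel(), b.ravel()
    return fa @ fb / (np.linalg.norm(fa) * np.linalg.norm(fb))

# Keep a layer only if it is not nearly identical to one already kept.
kept = []
for layer in layers:
    if all(similarity(layer, k) < 0.9 for k in kept):
        kept.append(layer)

print(f"{len(layers)} layers reduced to {len(kept)}")
```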

Leo Reiter is a cloud computing pioneer who has been designing, developing and evangelizing large-scale, on-demand systems and technologies since the mid-1990s. Currently, Leo serves as Chief Technology Officer of Nimbix, Inc.

Big Data is Driving HPC to the Cloud

April 21, 2015 2:09 pm | by Leo Reiter, CTO, Nimbix, Inc. | Blogs | Comments

For many computationally intensive applications, such as simulation, seismic processing and rendering, overall speed is still the name of the game. However, a new branch of HPC is gaining momentum. IDC calls it “High Performance Data Analysis” (HPDA for short). Essentially, it’s the union of big data and HPC. How will these architectures evolve? Let’s start by looking at the data.

The new Cray XC40 supercomputer and Sonexion storage system, Powered by Seagate, will provide PGS with the advanced computational capabilities necessary to run highly complex seismic processing and imaging applications.

Seagate Storage Technology Powering Four Cray Advanced HPC Implementations

April 21, 2015 11:49 am | by Seagate | News | Comments

Seagate Technology has announced that four Cray customers will be among the first to implement Seagate’s latest high performance computing storage technology. Combined, the implementations of these four customers in the government, weather, oil and gas, and university sectors will consume more than 120 petabytes of storage capacity.


Salford Predictive Modeler Data Mining Software Suite

April 21, 2015 11:16 am | Salford Systems | Product Releases | Comments

The Salford Predictive Modeler (SPM) software suite includes CART, MARS, TreeNet and Random Forests, as well as powerful automation and modeling capabilities. The software is designed to be a highly accurate and ultra-fast analytics and data mining platform for creating predictive, descriptive and analytical models from databases of any size, complexity or organization.
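The suite is commercial, but the tree-ensemble techniques it packages, such as CART and Random Forests, can be illustrated with open-source stand-ins. The following scikit-learn sketch is a generic analogy, not Salford's API:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# CART-style single decision tree: interpretable, but higher variance.
cart = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)

# Random Forest: many decorrelated trees voting, usually more accurate.
forest = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

print("single tree:", cart.score(X_te, y_te))
print("forest:     ", forest.score(X_te, y_te))
```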

Researchers tracked asthma-related tweets around the world, shown in the visualization above, then zoomed in on a particular region to see how the social media posts, when analyzed alongside other data, could help them predict asthma-related emergency room visits.

How Twitter Can Help Predict Emergency Room Visits

April 16, 2015 12:16 pm | by Alexis Blue, University of Arizona | News | Comments

A predictive model using machine learning algorithms is able to predict with 75 percent accuracy how many asthma-related emergency room visits a hospital could expect on a given day. Twitter users who post information about their personal health online might be considered by some to be "over-sharers," but new research suggests that health-related tweets may have the potential to be helpful for hospitals.
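The Arizona team's model draws on richer signals, but the general shape of such a predictor can be sketched with synthetic data: combine daily asthma-tweet counts with an air-quality index and train a classifier to flag high-demand days. Nothing below uses the team's data, features or code:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
days = 500

tweets = rng.poisson(lam=40, size=days)       # asthma-related tweets per day
air_quality = rng.normal(60, 15, size=days)   # an AQI-like index

# Synthetic ground truth: ER demand rises with both signals, plus noise.
load = 0.05 * tweets + 0.03 * air_quality + rng.normal(0, 1, size=days)
high_demand = (load > np.median(load)).astype(int)

X = np.column_stack([tweets, air_quality])
X_tr, X_te, y_tr, y_te = train_test_split(X, high_demand, random_state=0)

model = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
print(f"held-out accuracy: {model.score(X_te, y_te):.2f}")
```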


Ryft One

April 16, 2015 10:34 am | Ryft Systems, Inc. | Product Releases | Comments

Ryft One is an open platform for analyzing streaming, historical, unstructured and multi-structured data in real time. It is a commercial 1U platform capable of delivering fast, actionable business insights by analyzing both historical and streaming data at 10 gigabytes per second or faster.
