Big Data

The Lead

Molecule and Deep Learning – Frey’s team used computational deep learning techniques to train a system that mimics the process of splicing in the cell (left panel). Features such as motifs, RNA secondary structures and nucleosome positions are computation…

Deep Learning Reveals Unexpected Genetic Roots of Cancers, Autism and Other Disorders

December 18, 2014 4:23 pm | by The University of Toronto | News | Comments

In the decade since the human genome was sequenced, scientists and doctors have struggled to answer an all-consuming question: Which DNA mutations cause disease? A new computational technique developed at the University of Toronto may now be able to tell us. A team has developed the first method for ‘ranking’ genetic mutations based on how living cells ‘read’ DNA, revealing how likely any given alteration is to cause disease.
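The article does not publish the model itself, but the ranking idea can be sketched: score each mutation by how much it shifts a model's prediction for the sequence, then sort. In the hedged Python sketch below, the "model" is a toy motif counter standing in for the team's trained deep network; `toy_splice_signal`, the reference sequence and the variants are all invented for illustration.

```python
# Hedged sketch of ranking mutations by predicted impact.
# toy_splice_signal is a placeholder for a trained model: it just
# counts occurrences of a "GT" motif (a crude splice-donor proxy).

def toy_splice_signal(seq):
    """Stand-in for a learned model of how the cell 'reads' a sequence."""
    return sum(1 for i in range(len(seq) - 1) if seq[i:i + 2] == "GT")

def apply_variant(seq, pos, alt):
    """Substitute a single base at position pos."""
    return seq[:pos] + alt + seq[pos + 1:]

def rank_variants(ref_seq, variants):
    """Rank (pos, alt) variants by how much they shift the signal."""
    base = toy_splice_signal(ref_seq)
    scored = [((pos, alt),
               abs(toy_splice_signal(apply_variant(ref_seq, pos, alt)) - base))
              for pos, alt in variants]
    return sorted(scored, key=lambda item: -item[1])

ref = "AAGTCCGTAA"  # made-up reference sequence
ranking = rank_variants(ref, [(2, "C"), (6, "A"), (0, "T")])
# variants that destroy a motif rank above the neutral one
```

The real system replaces the motif counter with a deep network trained on cellular splicing data, but the ranking step works the same way: bigger predicted disruption, higher rank.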

Big Data Analysis Reveals Shared Genetic Code between Species

December 18, 2014 11:32 am | by Mike Williams, Rice University | News | Comments

Researchers have detected at least three instances of cross-species mating that likely...

VA Clinical Reasoning System Based on Watson Cognitive Capabilities

December 17, 2014 3:45 pm | News | Comments

IBM announced that the U.S. Department of Veterans Affairs is using Watson technology in a pilot...

ISC Cloud & Big Data Conferences to Merge in 2015

December 16, 2014 12:11 pm | by Suzanne Tracy, Editor-in-Chief, Scientific Computing and HPC Source | Blogs | Comments

ISC has announced the ISC Cloud & Big Data conference, which has merged into a three-day...


Big Data and genetic complexity: HotNet2 helps define the terrain for complex genetic associations involved in cancer. “The next step,” says researcher Ben Raphael, “is translating all of this information from cancer sequencing into clinically actionable…”

Big Data v. Cancer: Algorithm Identifies Genetic Changes across Cancers

December 15, 2014 4:00 pm | by Brown University | News | Comments

Using a computer algorithm that can sift through mounds of genetic data, researchers from Brown University have identified several networks of genes that, when hit by a mutation, could play a role in the development of multiple types of cancer. The algorithm, called HotNet2, was used to analyze genetic data from 12 different types of cancer assembled as part of the pan-cancer project of The Cancer Genome Atlas (TCGA).
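HotNet2 is described in the literature as an insulated heat-diffusion method: heat seeded on mutated genes spreads over an interaction network, and genes that stay hot suggest candidate subnetworks. The Python sketch below illustrates that idea on a made-up four-gene network, using a simple random-walk-with-restart iteration rather than the paper's closed-form diffusion matrix; the gene names and mutation scores are illustrative only.

```python
# Illustrative heat diffusion on a toy gene-interaction network.
# Heat seeded on frequently mutated genes spreads to neighbors;
# genes that stay hot after diffusion suggest candidate subnetworks.

def diffuse(adj, seed, beta=0.4, iters=200):
    """Random walk with restart over an undirected neighbor-list graph."""
    deg = {v: len(adj[v]) or 1 for v in adj}
    h = dict(seed)
    for _ in range(iters):
        h = {v: beta * seed[v] + (1 - beta) * sum(h[u] / deg[u] for u in adj[v])
             for v in adj}
    return h

network = {                      # made-up interactions
    "TP53":  ["MDM2", "ATM"],
    "MDM2":  ["TP53"],
    "ATM":   ["TP53", "CHEK2"],
    "CHEK2": ["ATM"],
}
seed = {"TP53": 1.0, "MDM2": 0.0, "ATM": 0.2, "CHEK2": 0.0}  # toy scores
heat = diffuse(network, seed)
hot = sorted(heat, key=heat.get, reverse=True)
```

Note how CHEK2, which carries no mutations of its own in this toy example, still ends up with nonzero heat because of its hot neighbor: that is the sense in which the method finds networks of genes rather than single genes.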

Researchers will track the lives of people with multiple sclerosis (MS) in unprecedented detail in OPTIMISE — a project to improve the evaluation of treatments.

Big Data Project Captures Multiple Sclerosis Patient Experience

December 11, 2014 3:43 pm | by Francesca Davenport, Imperial College London | News | Comments

MS affects more than two million people worldwide. Symptoms are different for everyone but commonly include fatigue, tingling, speech problems and difficulties with walking and balance. To gain a better understanding of MS and its treatments, there is a need for a system to collect comprehensive data that provides an in-depth picture of the experiences of MS patients across a large population.


Big Data Challenges at Virginia Academy Summit

December 4, 2014 5:25 pm | by Virginia Tech | News | Comments

Leaders in science, engineering, government, and industry will address fast-moving opportunities and challenges in the field of “big data” at the Virginia Summit on Science, Engineering, and Medicine.              


Unlocking the Potential of Big Data in the Cloud

December 4, 2014 3:38 pm | by IMDEA Networks Institute | News | Comments

Two ICT initiatives, cloud computing and big data, are filling technology headlines these days, promising to revolutionize computing, business practice, education and most areas of knowledge one can think of.


Computer Equal To or Better Than Humans at Cataloging Science

December 2, 2014 2:53 pm | by David Tenenbaum, University of Wisconsin-Madison | News | Comments

In 1997, IBM’s Deep Blue computer beat chess wizard Garry Kasparov. This year, a computer system developed at the University of Wisconsin-Madison equaled or bested scientists at the complex task of extracting data from scientific publications and placing it in a database that catalogs the results of tens of thousands of individual studies.
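As a toy illustration of what "extracting data from publications" means, the sketch below pulls (taxon, age) facts out of sentences with a single regular expression. The real UW-Madison system (DeepDive) combines many weak extraction signals with statistical inference; the pattern and sentences here are invented and reflect none of its actual code.

```python
# Toy "machine reading": one regex pulls (taxon, age) facts from text.
# The pattern and sentences are invented for illustration.
import re

PATTERN = re.compile(
    r"(?P<taxon>[A-Z][a-z]+)\s+fossils?\s+dated\s+to\s+(?P<age>\d+)\s+million\s+years"
)

text = ("Tyrannosaurus fossils dated to 68 million years were found. "
        "Triceratops fossils dated to 66 million years appear nearby.")

# Each match becomes one structured row for a results database.
facts = [(m["taxon"], int(m["age"])) for m in PATTERN.finditer(text)]
# facts -> [("Tyrannosaurus", 68), ("Triceratops", 66)]
```

A single hand-written pattern like this is brittle, which is precisely why the statistical approach in the article is newsworthy: it learns to weigh many noisy extractors instead of trusting one.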

IBM is investing in Pathway to position both companies on the cutting edge of offering truly personalized wellness information.

Evidence-based Medicine: Bringing Big Data to Healthcare Consumers

November 26, 2014 9:31 am | by Kalorama Information | News | Comments

The IBM Watson Group's investment in Pathway Genomics is a model for the types of partnerships that are bringing Big Data to the healthcare consumer marketplace. IBM hopes to use Watson, their cognitive technology, and Big Data — enormous medical datasets — to transform the quality and speed of care delivered to individuals through individualized, evidence-based medicine.


IBM 'Data Centric' Systems Tackle Big Data Challenges

November 17, 2014 4:34 pm | IBM Corp. | News | Comments

The U.S. Department of Energy has awarded IBM contracts valued at over $300 million to develop and deliver the world’s most advanced “data centric” supercomputing systems at Lawrence Livermore and Oak Ridge National Laboratories to advance innovation and discovery in science, engineering and national security.

David Turek is Vice President, Technical Computing OpenPOWER at IBM Corporation

The World of Supercomputing Must Become Data Centric

November 17, 2014 8:37 am | by David Turek, IBM | Blogs | Comments

It has been a commonly held belief that supercomputing capability is a predictable phenomenon with the "fastest" system in the world increasing in power by three orders of magnitude about every 11 years. I put the term "fastest" in quotes, because very few ask the question: Fastest in what way? It turns out that this notion of "fastest" is limited to a narrow consideration of system performance that focuses on floating point capability.
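The quoted growth rate is easy to sanity-check: three orders of magnitude every 11 years works out to roughly 1.87x per year, or a doubling about every 13 months.

```python
import math

# Three orders of magnitude (1000x) every ~11 years:
annual = 1000 ** (1 / 11)                        # ~1.87x growth per year
doubling_years = math.log(2) / math.log(annual)  # ~1.1 years to double
```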

Radhika Kulkarni, Ph.D., is SAS Vice President for Advanced Analytics R&D and a 2014 INFORMS Fellow.

Predictive Analytics: Harnessing Insights from Text and Network Data

November 14, 2014 10:44 am | by Radhika Kulkarni, Ph.D., SAS | Blogs | Comments

The predictive analytics landscape covers a wide variety of techniques and methods designed to derive insights from data. These techniques have been used successfully for many years on structured data. In recent times, the volume and variety of data available for analysis has exploded, and most of this data is in non-traditional forms.
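A common first step with such non-traditional data is converting unstructured text into numeric features a predictive model can consume. The sketch below computes TF-IDF weights with only the Python standard library; the maintenance-log sentences are invented, and this is a generic technique, not SAS's implementation.

```python
# TF-IDF with the standard library only; the "documents" are invented
# maintenance-log snippets standing in for real unstructured text.
import math
from collections import Counter

docs = [
    "pump vibration exceeded threshold",
    "pump pressure normal",
    "vibration sensor replaced after threshold alarm",
]

def tf_idf(documents):
    tokenized = [d.split() for d in documents]
    n = len(tokenized)
    df = Counter(t for doc in tokenized for t in set(doc))  # document frequency
    weights = []
    for doc in tokenized:
        tf = Counter(doc)
        weights.append({t: (tf[t] / len(doc)) * math.log(n / df[t]) for t in tf})
    return weights

features = tf_idf(docs)
# rare, document-specific terms ("pressure") outweigh common ones ("pump")
```

The resulting per-document weight vectors can feed any standard predictive model, which is how text joins structured data in an analytics pipeline.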

IBM and Pathway Genomics are aiming to revolutionize the health and wellness industry by leveraging the natural language processing and cognitive capabilities of Watson. For the first time consumers will be able to ask the Pathway Panorama app questions…

IBM Watson Group Invests in Pathway Genomics

November 13, 2014 2:28 pm | by IBM | News | Comments

Cognitive apps are in market today and continue to change the way professionals and consumers make decisions. To help accelerate this transformation, the IBM Watson Group announced an investment in Pathway Genomics, a clinical laboratory that offers genetic testing services globally, to help deliver the first-ever cognitive consumer-facing app based on a user’s personal genetic makeup.

Black pine (Pinus nigra), one of the species whose life history data is part of the database, is seen against a stunning backdrop of New Zealand. Courtesy of Yvonne Buckley

Big Data Takes Root in the World of Plant Research

November 12, 2014 3:47 pm | by Trinity College Dublin | News | Comments

Botanists have launched a database with information that documents significant ‘life events’ for nearly 600 plant species across the globe. They teamed up with like-minded researchers working across five continents to compile the huge database of plant life histories, drawing on data gathered over a span of nearly 50 years.

Products and services based on TIBCO's Fast Data platform are designed to enable businesses worldwide to turn big data into a differentiator.

TIBCO Announces New Ease-of-Use Enhancements to Fast Data Capabilities

November 3, 2014 11:37 am | by TIBCO Software | News | Comments

TIBCO Software has announced improvements to the company's Fast Data capabilities, enabling IT and business users to leverage today’s rapidly changing business environment. Products and services based on TIBCO's Fast Data platform are designed to enable businesses worldwide to turn big data into a differentiator.

Jaspersoft 5.6 Open Source Release

TIBCO Jaspersoft 5.6 Open Source Release

October 31, 2014 11:29 am | TIBCO | Product Releases | Comments

The TIBCO Jaspersoft business intelligence platform includes native big data connectors with support for Apache Hadoop, Apache Hive, Apache Cassandra and Cloudera, helping to deliver real-time reporting. Additional interactive reporting features provide a customizable experience for users.

Indiana University received one of the largest individual awards from the NSF’s $31 million Data Infrastructure Building Blocks program this year. Researchers will use the $5 million in funding to help boost the nation’s big data efforts. Courtesy of NSF

NSF Awards $5M to Empower Researchers with New Data Analysis Tools

October 29, 2014 10:15 am | by Indiana University Bloomington | News | Comments

A team of computer scientists working to improve how researchers across the sciences harness big data to solve problems has been awarded $5 million by the National Science Foundation. The team will address one of the leading challenges in tackling some of the world’s most pressing issues in science: the ability to analyze and compute large amounts of data.

LLNL researcher Monte LaBute was part of a Lab team that recently published an article in PLOS ONE detailing the use of supercomputers to link proteins to drug side effects. Courtesy of Julie Russell/LLNL

Supercomputers Link Proteins to Adverse Drug Reactions

October 21, 2014 10:40 am | by Kenneth K Ma, Lawrence Livermore National Laboratory | News | Comments

The drug creation process often misses many side effects that kill at least 100,000 patients a year. LLNL researchers have developed a method that uses supercomputers to identify proteins that cause medications to produce certain adverse drug reactions, processing proteins and drug compounds through an algorithm that yields reliable data for drug discovery outside of a laboratory setting.

Diver collecting microbial samples from Australian seaweeds for Uncovering Genome Mysteries

Crowdsourced Supercomputing Examines Big Genomic Data

October 21, 2014 9:31 am | by IBM | News | Comments

What do the DNA in Australian seaweed, Amazon River water, tropical plants, and forest soil all have in common? Lots, say scientists. And understanding the genetic similarities of disparate life forms could enable researchers to produce compounds for new medicines, eco-friendly materials, more resilient crops, and cleaner air, water and energy.

An IBM logo displayed in Berlin, VT. IBM is paying $1.5 billion to Globalfoundries in order to shed its costly chip division. (AP Photo/Toby Talbot)

IBM to Pay $1.5B to Shed Costly Chip Division

October 20, 2014 10:54 am | by Michelle Chapman, AP Business Writer | News | Comments

IBM will pay $1.5 billion to Globalfoundries in order to shed its costly chip division. IBM Director of Research John E. Kelly III said in an interview on October 20, 2014, that handing over control of the semiconductor operations will allow it to grow faster, while IBM continues to invest in and expand its chip research.

Urika-XA System for Big Data Analytics

Cray Urika-XA System for Big Data Analytics

October 16, 2014 9:53 am | Cray Inc. | Product Releases | Comments

The Cray Urika-XA System is an open platform for high-performance big data analytics, pre-integrated with the Apache Hadoop and Apache Spark frameworks. It is designed to provide users with the benefits of a turnkey analytics appliance combined with a flexible, open platform that can be modified for future analytics workloads.


2015 Rice Oil & Gas High Performance Computing Workshop

October 13, 2014 2:45 pm | by Rice University | Events

The Oil and Gas High Performance Computing (HPC) Workshop, hosted annually at Rice University, is the premier meeting place for discussion of challenges and opportunities around high performance computing, information technology, and computational science and engineering.

On Tuesday, October 7, IBM Watson Group Senior Vice President Mike Rhodin and travel entrepreneur Terry Jones attended the opening of IBM Watson's global headquarters in New York City's Silicon Alley.

IBM Watson Fuels Next Generation of Cognitive Computing

October 13, 2014 11:32 am | by IBM | News | Comments

Next-gen leaders push themselves every day to answer this key question: How can my organization make a difference? IBM is helping to deliver the answer with new apps powered by Watson to improve the quality of life. IBM's Watson is a groundbreaking platform with the ability to interact in natural language, process vast amounts of disparate forms of big data and learn from each interaction.

A new principle, called data smashing, estimates the similarities between streams of arbitrary data without human intervention, and without access to the data sources.

Data Smashing Could Unshackle Automated Discovery

October 8, 2014 11:45 am | by Cornell University | News | Comments

A little-known secret in data mining is that simply feeding raw data into a data analysis algorithm is unlikely to produce meaningful results. New discoveries often begin with comparison of data streams to find connections and spot outliers. Yet most data comparison algorithms today have one major weakness: somewhere, they rely on a human expert. And experts aren’t keeping pace with the complexities of big data.
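Data smashing itself relies on an algebraic "stream inversion" operation not reproduced here; as a stand-in in the same parameter-free spirit, the sketch below uses the normalized compression distance (NCD), a different but similarly model-free way to compare arbitrary byte streams with no human-chosen features.

```python
# Normalized compression distance: parameter-free stream comparison.
# Streams with shared structure compress well together, so their
# joint compressed size barely exceeds either alone.
import zlib

def ncd(x: bytes, y: bytes) -> float:
    cx, cy = len(zlib.compress(x)), len(zlib.compress(y))
    cxy = len(zlib.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

periodic_a = b"01" * 100        # two streams with the same structure
periodic_b = b"01" * 100
prose = b"the quick brown fox jumps over the lazy dog " * 4

d_same = ncd(periodic_a, periodic_b)   # small: shared structure
d_diff = ncd(periodic_a, prose)        # larger: unrelated streams
```

The appeal, as with data smashing, is that nothing in the measure encodes what the streams "mean": similarity falls out of the data alone.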

On Tuesday, October 7, IBM Watson Group Vice President, Client Experience Centers, Ed Harbour opens the IBM Watson global headquarters in New York City's Silicon Alley. (Courtesy Jon Simon/Feature Photo Service for IBM)

IBM Watson Global Headquarters Opens for Business in Silicon Alley

October 8, 2014 10:33 am | by IBM | News | Comments

IBM Watson Group's global headquarters, at 51 Astor Place in New York City's Silicon Alley, is open for business. The Watson headquarters will serve as a home base for more than 600 IBM Watson employees, just part of the more than 2,000 IBMers dedicated to Watson worldwide. In addition to a sizeable employee presence, IBM is opening its doors to area developers and entrepreneurs, hosting industry workshops, seminars and networking events.

IBM has announced new capabilities for its System z mainframe.

IBM Delivers New Analytics Offerings for the Mainframe to Provide Real-Time Customer Insights

October 7, 2014 2:09 pm | by IBM | News | Comments

Building on client demand to integrate real-time analytics with consumer transactions, IBM has announced new capabilities for its System z mainframe. The integration of analytics with transactional data can provide businesses with real-time, actionable insights on commercial transactions as they occur to take advantage of new opportunities to increase sales and help minimize loss through fraud prevention.

Bernie Spang is Vice President of Software Defined Strategy at IBM.

Scientific Research and Big Data: It Starts with Storage

September 24, 2014 11:52 am | by Bernie Spang, IBM | Blogs | Comments

For centuries, scientific research has been about data, and as data in research continues to grow exponentially, so does the importance of how it’s stored. A key example of how the scientific field can tackle Big Data storage is DESY, a research organization dedicated to providing scientists worldwide with faster access to insights from their samples, a mission that makes optimal data management in a high-volume environment critical.

On September 20, early-bird pricing for the ISC Cloud and ISC Big Data registrations will be replaced with regular registration fees.

Early Bird Rate for ISC Cloud and Big Data Conferences to End Soon

September 16, 2014 3:03 pm | by ISC | News | Comments

On September 20, early-bird pricing for the ISC Cloud and ISC Big Data registrations will be replaced with regular registration fees. With the regular rates, the passes will cost 100 Euro more for each conference, and the combined conference ticket, which allows attendees to participate in both events, will cost 150 Euro more. Thus, ISC is encouraging attendees to register this week in order to benefit from the current savings.
