
The Lead

IBM introduces Storage as a Service on SoftLayer for High Performance Data Management in Cloud

July 15, 2014 11:38 am | by IBM | News | Comments

IBM is announcing a new software-defined storage-as-a-service offering on IBM SoftLayer, code-named Elastic Storage on Cloud, that gives organizations access to a fully supported, ready-to-run storage environment. The service includes SoftLayer bare-metal resources and high-performance data management, and it allows organizations to move data between their on-premises infrastructure and the cloud.

On the Trail of Paradigm-Shifting Methods for Solving Mathematical Models

July 15, 2014 10:11 am | by Hengguang Li | Blogs | Comments

How using CPU/GPU parallel computing is the next logical step - My work in...

National Data Service kicks off with the Materials Data Facility

July 9, 2014 4:12 pm | by Amber Harmon | News | Comments

In nearly every field of science, experiments, instruments, observations, sensors, simulations,...

Moab HPC Suite-Enterprise Edition 8.0

July 7, 2014 10:04 am | Product Releases | Comments

Moab HPC Suite-Enterprise Edition 8.0 (Moab 8.0) is designed to enhance Big Workflow by...


Computer-aided Diagnosis of Rare Genetic Disorders from Family Photos

June 30, 2014 11:04 am | by University of Oxford | News | Comments

Computer analysis of photographs could help doctors diagnose which condition a child with a rare genetic disorder has, say Oxford University researchers.                           

A Simple Solution for Big Data

June 27, 2014 11:19 am | by SISSA | News | Comments

To be able to use huge amounts of data, we have to understand them, and before that we need to categorize them in an effective, fast and automatic manner. Two researchers have devised a form of cluster analysis (grouping data sets according to their "similarity") based on simple and powerful principles, which has proved very efficient and capable of solving some of the most typical problems encountered in this kind of analysis.
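
The article does not describe the researchers' exact algorithm, but the general idea of similarity-based clustering can be illustrated with a small sketch: points with many close neighbors act as cluster centers, and every other point joins the cluster of its nearest denser neighbor. The cutoff distance `d_cut` and the center criterion below are illustrative choices, not the published method.

```python
import math

def cluster_by_similarity(points, d_cut=1.0):
    """Toy density-based clustering sketch (not the authors' algorithm):
    dense points far from any denser point become centers; all other
    points inherit the label of their nearest denser neighbor."""
    n = len(points)
    dist = [[math.dist(points[i], points[j]) for j in range(n)] for i in range(n)]
    # Local density: how many neighbors fall within the cutoff distance.
    rho = [sum(1 for j in range(n) if j != i and dist[i][j] < d_cut) for i in range(n)]
    order = sorted(range(n), key=lambda i: -rho[i])  # decreasing density
    # delta: distance to the nearest point of higher (or equal, earlier) density.
    delta = [0.0] * n
    nearest_denser = [-1] * n
    delta[order[0]] = max(dist[order[0]])  # densest point: use its farthest distance
    for rank, i in enumerate(order[1:], start=1):
        j = min(order[:rank], key=lambda k: dist[i][k])
        delta[i], nearest_denser[i] = dist[i][j], j
    # Centers are dense points that sit far from any denser point.
    centers = [i for i in range(n) if delta[i] > d_cut and rho[i] > 0]
    labels = [-1] * n
    for c_id, c in enumerate(centers):
        labels[c] = c_id
    for i in order:  # assign in decreasing-density order, so parents are labeled first
        if labels[i] == -1:
            labels[i] = labels[nearest_denser[i]]
    return labels
```

On two well-separated blobs of points, the sketch recovers one label per blob without being told the number of clusters in advance.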

The Great Salmon Run Algorithm

June 25, 2014 10:42 am | by Inderscience Publishers | News | Comments

Mechanical engineers at the Babol University of Technology in Mazandaran, Iran, have turned to nature to devise an algorithm based on the survival trials faced by salmon swimming upstream to the spawning grounds to help them fish out the optimal solution to a given problem.
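
The paper's specific operators are not given in the article, but the family it belongs to (nature-inspired, population-based optimization) follows a common pattern: a population of candidate solutions explores around the best-known position, with the search radius shrinking over time. The sketch below shows that pattern only; it is not the published Great Salmon Run algorithm, and the "swimmer" metaphor and parameters are invented for illustration.

```python
import random

def salmon_style_search(f, bounds, n_swimmers=30, n_steps=200, seed=0):
    """Illustrative population-based random search (NOT the published
    Great Salmon Run algorithm). Each 'swimmer' proposes a move near the
    best-known position; moves that improve its own score are accepted,
    and the step size shrinks as the run progresses."""
    rng = random.Random(seed)
    lo, hi = bounds
    swimmers = [rng.uniform(lo, hi) for _ in range(n_swimmers)]
    best = min(swimmers, key=f)
    for step in range(n_steps):
        scale = (hi - lo) * (1.0 - step / n_steps)  # shrinking search radius
        for i, x in enumerate(swimmers):
            candidate = best + rng.gauss(0, scale * 0.1)
            candidate = min(max(candidate, lo), hi)  # keep within bounds
            if f(candidate) < f(x):                  # greedy acceptance
                swimmers[i] = candidate
        best = min(swimmers, key=f)
    return best
```

For a simple convex objective such as (x - 3)^2 on [-10, 10], the population converges close to the minimizer x = 3.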

Steve Conway is Research VP, IDC High Performance Computing Group.

When Massive Data Never becomes Big Data

June 18, 2014 3:38 pm | by Steve Conway, IDC | Blogs | Comments

The recent PRACE Days conference in Barcelona provided powerful reminders that massive data doesn't always become big data — mainly because moving and storing massive data can cost massive money. PRACE is the Partnership for Advanced Computing in Europe, and the 2014 conference was the first to bring together scientific and industrial users of PRACE supercomputers located in major European nations.

IDC’s new in-depth forecasts are the first that track more than a dozen application and industry segments, including economically important new use cases for HPC.

IDC Announces First In-Depth Forecasts for Worldwide HPC Big Data Market

June 18, 2014 8:57 am | by IDC | News | Comments

IDC has announced the availability of the first in-depth forecasts for high performance data analysis (HPDA), the fast-growing worldwide market for big data workloads that use high performance computing resources. IDC forecasts that the server market for HPDA will grow rapidly, at a 23.5 percent compound annual growth rate (CAGR), to reach $2.7 billion in 2018, and that the related storage market will expand to about $1.6 billion in the same year.
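
A CAGR figure pins down the whole growth curve once a horizon is fixed: value_end = value_start × (1 + CAGR)^years. The article does not state IDC's base year, so the five-year horizon from 2013 below is an assumption used only to show the arithmetic.

```python
def implied_base(value_end, cagr, years):
    """Back out the starting value implied by a CAGR forecast:
    value_end = value_start * (1 + cagr) ** years."""
    return value_end / (1 + cagr) ** years

# IDC's figures: a $2.7B HPDA server market in 2018 at a 23.5% CAGR.
# Assuming (our assumption, not IDC's stated baseline) a five-year
# horizon from 2013, the implied 2013 market size in $B is:
base_2013 = implied_base(2.7, 0.235, 5)  # roughly $0.94B
```

Compounding forward again at 23.5 percent per year recovers the $2.7 billion 2018 figure exactly, which is a quick sanity check on any CAGR claim.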

Gord Sissons, Product Marketing Manager for IBM Platform Symphony at IBM

The Evolving HPC Cluster: Big Compute meets Big Data

May 28, 2014 4:16 pm | by Gord Sissons, IBM | Blogs | Comments

HPC systems have evolved significantly over the last two decades. While HPC was once the domain of purpose-built supercomputers, clustered systems now rule the roost. Horizontal scaling has proven to be the most cost-efficient way to increase capacity. What supercomputers all have in common today is their reliance on distributed computing.

Stefan Groschupf, CEO and Co-founder, Datameer

Pumping Data: How Data Analytics is the New Athletic Advantage

May 21, 2014 11:26 am | by Stefan Groschupf, CEO and Co-founder, Datameer | Blogs | Comments

In a sport where milliseconds matter, the 2012 U.S. Women’s Olympic cycling team found their competitive edge in an unlikely place – data science. The team went from a five-second deficit at the world championships to earning a Silver medal in the 2012 London Olympics — a triumphant feat that was achieved not only through dedication and athletic ability, but also through enhancing training with insights gained from analyzing big data.

The IBM Watson Group has a new headquarters at 51 Astor Place in New York City’s “Silicon Alley” technology hub, leveraging the talents of approximately 2,000 professionals, whose goal is to design, develop and accelerate the adoption of Watson cognitive

IBM Reveals Companies Developing Watson-Powered Apps

May 19, 2014 4:42 pm | by IBM | News | Comments

Technology entrepreneurs wake up every morning with the goal of creating innovations that can change the world. IBM has announced a new class of innovators that are making their visions a reality by creating apps fueled by Watson's cognitive computing intelligence.

IBM Elastic Storage

May 16, 2014 2:34 pm | Ibm Corporation | Product Releases | Comments

Elastic Storage is capable of reducing storage costs up to 90 percent by automatically moving data onto the most economical storage device. The technology allows enterprises to exploit the exploding growth of data in a variety of forms generated by devices, sensors, business processes and social networks.
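
The cost-reduction claim rests on policy-driven tiering: data that has not been touched recently migrates to cheaper media. A toy decision rule of that kind might look like the following; the tier names and day thresholds are invented for illustration and are not IBM's actual policy engine.

```python
def pick_tier(days_since_access, hot_days=30, warm_days=180):
    """Toy age-based tiering rule illustrating the idea behind automated
    data placement: recently used data stays on fast media, cold data
    moves to the cheapest tier. Thresholds and tier names are made up."""
    if days_since_access <= hot_days:
        return "flash"   # fastest, most expensive per GB
    if days_since_access <= warm_days:
        return "disk"    # middle ground
    return "tape"        # slowest, cheapest per GB
```

In a real system the policy would also weigh file size, access frequency and migration cost, but even this simple rule shows how cold data drifts toward economical storage automatically.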

Spotfire 6.5 Analytics Platform

April 29, 2014 10:39 am | Tibco | Product Releases | Comments

Spotfire 6.5 analytics platform allows users to easily connect to diverse data sources, including spatial data sources, and create rich visualizations, enabling analytics from the simplest to the most complex levels. Features include the single-seat Spotfire desktop product, which provides the full power and ease of use of the Spotfire platform for individual users...

IDC HPC User Forum Europe 2014

April 28, 2014 2:56 pm | by IDC | Events

The IDC HPC User Forum will meet at HLRS in Stuttgart and another location in October 2014. HPC User Forum meetings are open to anyone with an interest in high performance computing or high performance data analysis (big data using HPC), including users, vendors, funders, and others.

In this April 17, 2014 photo, President Barack Obama speaks in the White House briefing room in Washington. A White House review of how the government and private sector use large sets of data has found that such information could be used to discriminate

Discrimination Potential Seen in Big Data Use

April 28, 2014 11:35 am | by Eileen Sullivan, Associated Press | News | Comments

A White House review of how the government and private sector use large sets of data has found that such information could be used to discriminate against Americans on issues such as housing and employment even as it makes their lives easier in many ways. "Big data" is everywhere.

ELNs, Data Analytics and Knowledge Management Summit

April 25, 2014 12:06 pm | by EDKM | Events

The only event dedicated to the next generation of lab informatics applications and to building a searchable, shareable database to improve decision making and efficiency. After the success of last year's inaugural ELNs, Data Analytics and Knowledge Management event in the US, Pharma IQ has announced the second EDKM conference, to be held on 17-18 June 2014 in Boston, US.

Computer Maps 21 Distinct Emotional Expressions

March 31, 2014 5:35 pm | by Pam Frost Gorder, Ohio State University | News | Comments

Researchers have found a way for computers to recognize 21 distinct facial expressions — even expressions for complex or seemingly contradictory emotions such as "happily disgusted" or "sadly angry."

IDC 54th HPC User Forum Seattle

March 30, 2014 2:23 pm | by IDC | Events

The IDC HPC User Forum will meet at The Grand Hyatt Seattle, September 15 to 17, 2014. HPC User Forum meetings are open to anyone with an interest in high performance computing or high performance data analysis (big data using HPC), including users, vendors, funders, and others.

How the Flu Bug Bit Google: Where Big Data Analysis Can Go Wrong

March 14, 2014 3:19 pm | by Marisa Ramirez, University of Houston | News | Comments

Ryan Kennedy, University of Houston political science professor, and his co-researchers detail new research about the problematic use of big data from aggregators such as Google's Google Flu Trends. Numbers and data can be critical tools in bringing complex issues into a crisp focus. The understanding of diseases, for example, benefits from algorithms that help monitor their spread. But without context, a number may just be a number.

Simplifying Data Analysis and Making Sense of Big Data

March 14, 2014 2:25 pm | by Wallace Ravven, University of California, Berkeley | News | Comments

Ben Recht is looking for problems. He develops mathematical strategies to help researchers, from urban planners to online retailers, cut through blizzards of data to find what they’re after. He resists the “needle in the haystack” metaphor because, he says, the researchers, engineers and business people he has worked with usually don’t know enough about their data to reach their goal.

Streamlining Big Data Analysis Improves Accuracy and Performance

March 12, 2014 3:57 pm | by Suzanne Tracy, Editor-in-Chief, Scientific Computing and HPC Source | Blogs | Comments

Next week, Scientific Computing will host a live panel discussion that looks at how a unique supercomputing system, created to serve the needs of a scientific community alliance in seven northern German states, has unified datacenter resources to address big data challenges. By streamlining the analysis process through automation, the HLRN alliance has improved performance and increased accuracy, resulting in greater efficiency.

Be an Asteroid Hunter in NASA's First Asteroid Grand Challenge

March 11, 2014 11:18 am | by NASA | News | Comments

NASA’s Asteroid Data Hunter contest series will offer $35,000 in awards over the next six months to citizen scientists who develop improved algorithms that can be used to identify asteroids. This contest is being conducted in partnership with Planetary Resources of Bellevue, WA.

Big Workflow: The Future of Big Data Computing

March 7, 2014 4:04 pm | by Robert Clyde, Adaptive Computing | Blogs | Comments

How can organizations embrace — instead of brace for — the rapidly intensifying collision of public and private clouds, HPC environments and Big Data? The current go-to solution for many organizations is to run these technology assets in siloed, specialized environments. This approach falls short, however, typically taxing one datacenter area while others remain underutilized, functioning as little more than expensive storage space.

Big Compute: The Collision of where HPC is Meeting the Challenges of Big Data

March 7, 2014 3:59 pm | by Jason Stowe, Cycle Computing | Blogs | Comments

At Cycle Computing, we're seeing several large trends as they relate to Big Data and Analytics. We started talking about this concept of Big Compute back in Oct. 2012. In many ways, it's the point where HPC meets the challenges of Big Data. As our technical capabilities continue to expand in the ways we can collect and store data, the problem of how we access and use data is only growing.

Scalable Productivity and the Ever-Increasing Tie between Data Analytics, Data Management and Computation

March 7, 2014 3:52 pm | by Barry Bolding, Cray | Blogs | Comments

Cray continues to see an increasing trend in the HPC marketplace that we are calling “data-intensive” supercomputing.  The dramatic growth in scientific, commercial and social data is resulting in an expanded customer base that is asking for much more complex analysis and simulation.

Big Data & HPC: The Modern Crystal Ball for 2014

March 7, 2014 3:41 pm | by Jorge Titinger, SGI | Blogs | Comments

In 2013, the term big data continued to dominate as a source of technology challenges, experimentation and innovation. It’s no surprise then that many business and IT executives are suffering from big data exhaustion, causing Gartner to deem 2013 as the year the technology entered the “Trough of Disillusionment.”

High Performance Data Analysis: Big Data Meets HPC

March 7, 2014 3:33 pm | by Steve Conway, IDC | Blogs | Comments

From the start of the supercomputer era in the 1960s — and even earlier — an important subset of HPC jobs has involved analytics: attempts to uncover useful information and patterns in the data itself. Cryptography, one of the original scientific-technical computing applications, falls predominantly into this category.

Big Data Meets HPC

March 7, 2014 2:49 pm | by Suzanne Tracy, Editor-in-Chief, Scientific Computing and HPC Source | Articles | Comments

Steve Conway, IDC VP HPC explains that, to date, most data-intensive HPC jobs in the government, academic and industrial sectors have involved the modeling and simulation of complex physical and quasi-physical systems. However, he notes that from the start of the supercomputer era in the 1960s — and even earlier — an important subset of HPC jobs has involved analytics, attempts to uncover useful information and patterns in the data itself.
