IBM is announcing a new software-defined storage-as-a-service offering on IBM SoftLayer, code-named Elastic Storage on Cloud, that gives organizations access to a fully supported, ready-to-run storage environment. The offering includes SoftLayer bare-metal resources and high-performance data management, and allows organizations to move data between their on-premises infrastructure and the cloud.
How using CPU/GPU parallel computing is the next logical step - My work in...
In nearly every field of science, experiments, instruments, observations, sensors, simulations,...
Computer analysis of photographs could help doctors diagnose which condition a child with a rare genetic disorder has, say Oxford University researchers.
To make use of huge amounts of data, we have to understand them, and before that we need to categorize them in an effective, fast and automatic manner. Two researchers have devised a type of cluster analysis, the ability to group data according to their "similarity," based on simple and powerful principles, which proved to be very efficient and capable of solving some of the most typical problems encountered in this type of analysis.
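The core idea of similarity-based cluster analysis can be illustrated with a minimal sketch. This is not the researchers' published method; it is a hypothetical toy that groups one-dimensional points into clusters whenever they lie within a chosen similarity threshold of a cluster's first member.

```python
# Hypothetical sketch of similarity-based clustering; the function name,
# threshold rule and data are invented for illustration only.

def cluster_by_similarity(points, threshold):
    """Assign each point to the first cluster whose representative
    (first member) lies within `threshold`; else start a new cluster."""
    clusters = []
    for p in points:
        for c in clusters:
            if abs(p - c[0]) <= threshold:  # "similar enough" to join
                c.append(p)
                break
        else:
            clusters.append([p])  # no similar cluster found
    return clusters

data = [1.0, 1.2, 0.9, 5.0, 5.3, 9.8]
print(cluster_by_similarity(data, 1.0))
# three groups emerge: near 1, near 5, and near 10
```

Real methods replace the fixed threshold with data-driven notions of density and distance, which is what makes them robust to clusters of irregular shape.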
Mechanical engineers at the Babol University of Technology in Mazandaran, Iran, have turned to nature to devise an algorithm based on the survival trials faced by salmon swimming upstream to the spawning grounds to help them fish out the optimal solution to a given problem.
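The general shape of such nature-inspired optimizers can be sketched as a population-based stochastic search: a "school" of candidate solutions repeatedly concentrates around the best survivor while still exploring. This is a generic illustration, not the published salmon-run algorithm's actual update rules.

```python
import random

# Generic population-based stochastic minimizer in the spirit of
# nature-inspired metaheuristics (illustrative sketch only).

def minimize(f, lo, hi, pop_size=20, generations=100, seed=0):
    rng = random.Random(seed)
    # Initial random "school" of candidates in [lo, hi].
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(generations):
        best = min(pop, key=f)
        # New generation clusters near the current best position,
        # with Gaussian perturbation to keep exploring.
        pop = [min(max(best + rng.gauss(0, 0.1 * (hi - lo)), lo), hi)
               for _ in range(pop_size)]
        pop[0] = best  # elitism: never lose the best candidate
    return min(pop, key=f)

# Minimize (x - 3)^2 on [-10, 10]; the result approaches 3.
print(minimize(lambda x: (x - 3) ** 2, -10, 10))
```

Elitism guarantees the best candidate only improves across generations; the perturbation scale controls the trade-off between exploration and convergence.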
The recent PRACE Days conference in Barcelona provided powerful reminders that massive data doesn't always become big data — mainly because moving and storing massive data can cost massive money. PRACE is the Partnership for Advanced Computing in Europe, and the 2014 conference was the first to bring together scientific and industrial users of PRACE supercomputers located in major European nations.
IDC has announced the availability of the first in-depth forecasts for high performance data analysis (HPDA), the fast-growing worldwide market for big data workloads that use high performance computing resources. IDC forecasts that the server market for HPDA will grow rapidly at a 23.5 percent compound annual growth rate (CAGR) to reach $2.7 billion in 2018, and that the related storage market will expand to about $1.6 billion in the same year.
HPC systems have evolved significantly over the last two decades. While once the domain of purpose-built supercomputers, today clustered systems rule the roost. Horizontal scaling has proven to be the most cost-efficient way to increase capacity. What supercomputers all have in common today is their reliance on distributed computing.
In a sport where milliseconds matter, the 2012 U.S. Women’s Olympic cycling team found their competitive edge in an unlikely place – data science. The team went from a five-second deficit at the world championships to earning a Silver medal in the 2012 London Olympics — a triumphant feat that was achieved not only through dedication and athletic ability, but also through enhancing training with insights gained from analyzing big data.
Technology entrepreneurs wake up every morning with the goal of creating innovations that can change the world. IBM has announced a new class of innovators that are making their visions a reality by creating apps fueled by Watson's cognitive computing intelligence.
Elastic Storage is capable of reducing storage costs up to 90 percent by automatically moving data onto the most economical storage device. The technology allows enterprises to exploit the exploding growth of data in a variety of forms generated by devices, sensors, business processes and social networks.
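Automated tiering of the kind described above can be sketched as a simple policy: demote data that has gone cold to a cheaper tier, promote data that is hot again. The file names, tier names and 30-day policy below are invented for illustration; Elastic Storage's actual placement policies are far richer.

```python
from datetime import datetime, timedelta

# Hypothetical illustration of automated storage tiering
# (not IBM's actual policy engine).

def plan_tiering(files, now, cold_after_days=30):
    """files: list of (name, last_access: datetime, tier).
    Returns a migration plan {name: target_tier}."""
    plan = {}
    for name, last_access, tier in files:
        cold = (now - last_access) > timedelta(days=cold_after_days)
        if cold and tier == "ssd":
            plan[name] = "archive"   # demote cold data to cheap storage
        elif not cold and tier == "archive":
            plan[name] = "ssd"       # promote recently accessed data
    return plan

now = datetime(2014, 6, 1)
files = [
    ("model.dat", datetime(2014, 5, 30), "ssd"),      # hot, stays put
    ("logs.tar", datetime(2014, 1, 15), "ssd"),       # cold, demote
    ("report.pdf", datetime(2014, 5, 29), "archive"), # hot, promote
]
print(plan_tiering(files, now))
# {'logs.tar': 'archive', 'report.pdf': 'ssd'}
```

The cost savings come from the demotion path: most data goes cold quickly, so the bulk of capacity ends up on the cheapest tier without manual intervention.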
Spotfire 6.5 analytics platform allows users to easily connect to diverse data sources, including spatial data sources, and create rich visualizations, enabling analytics from the simplest to the most complex levels. Features include the single-seat Spotfire desktop product, which provides the full power and ease of use of the Spotfire platform for individual users...
The IDC HPC User Forum will meet at HLRS in Stuttgart and another location in October 2014. HPC User Forum meetings are open to anyone with an interest in high performance computing or high performance data analysis (big data using HPC), including users, vendors, funders, and others.
A White House review of how the government and private sector use large sets of data has found that such information could be used to discriminate against Americans on issues such as housing and employment even as it makes their lives easier in many ways. "Big data" is everywhere.
The only event dedicated to the next generation of lab informatics applications and building a searchable, shareable database to improve decision making and efficiency. After the success of last year’s inaugural ELNs, Data Analytics and Knowledge Management event in the US, Pharma IQ has announced the second EDKM conference, to be held on 17th & 18th June 2014 in Boston, US.
Researchers have found a way for computers to recognize 21 distinct facial expressions - even expressions for complex or seemingly contradictory emotions such as “happily disgusted” or “sadly angry.”
The IDC HPC User Forum will meet at The Grand Hyatt Seattle, September 15 to 17, 2014. HPC User Forum meetings are open to anyone with an interest in high performance computing or high performance data analysis (big data using HPC), including users, vendors, funders, and others.
Ryan Kennedy, University of Houston political science professor, and his co-researchers detail new research about the problematic use of big data from aggregators such as Google Flu Trends. Numbers and data can be critical tools in bringing complex issues into a crisp focus. The understanding of diseases, for example, benefits from algorithms that help monitor their spread. But without context, a number may just be a number.
Ben Recht is looking for problems. He develops mathematical strategies to help researchers, from urban planners to online retailers, cut through blizzards of data to find what they’re after. He resists the “needle in the haystack” metaphor because, he says, the researchers, engineers and business people he has worked with usually don’t know enough about their data to reach their goal.
Next week, Scientific Computing will host a live panel discussion that looks at how a unique supercomputing system, created to serve the needs of a scientific community alliance in seven northern German states, has unified datacenter resources to address big data challenges. By streamlining the analysis process through automation, the HLRN alliance has improved performance and increased accuracy, resulting in greater efficiency.
NASA’s Asteroid Data Hunter contest series will offer $35,000 in awards over the next six months to citizen scientists who develop improved algorithms that can be used to identify asteroids. This contest is being conducted in partnership with Planetary Resources of Bellevue, WA.
How can organizations embrace, instead of brace for, the rapidly intensifying collision of public and private clouds, HPC environments and Big Data? The current go-to solution for many organizations is to run these technology assets in siloed, specialized environments. This approach falls short, however, typically taxing one datacenter area while others remain underutilized, functioning as little more than expensive storage space.
At Cycle Computing we’re seeing several large trends related to Big Data and Analytics. We started talking about this concept of Big Compute back in October 2012. In many ways, it’s where HPC meets the challenges of Big Data. As our technical capabilities continue to expand in the ways we can collect and store data, the problem of how we access and use data is only growing.
Scalable Productivity and the Ever-Increasing Tie between Data Analytics, Data Management and Computation, by Barry Bolding, Cray (March 7, 2014)
Cray continues to see an increasing trend in the HPC marketplace that we are calling “data-intensive” supercomputing. The dramatic growth in scientific, commercial and social data is resulting in an expanded customer base that is asking for much more complex analysis and simulation.
In 2013, the term big data continued to dominate as a source of technology challenges, experimentation and innovation. It’s no surprise then that many business and IT executives are suffering from big data exhaustion, causing Gartner to deem 2013 as the year the technology entered the “Trough of Disillusionment.”
From the start of the supercomputer era in the 1960s — and even earlier — an important subset of HPC jobs has involved analytics: attempts to uncover useful information and patterns in the data itself. Cryptography, one of the original scientific-technical computing applications, falls predominantly into this category.
Steve Conway, IDC VP HPC explains that, to date, most data-intensive HPC jobs in the government, academic and industrial sectors have involved the modeling and simulation of complex physical and quasi-physical systems. However, he notes that from the start of the supercomputer era in the 1960s — and even earlier — an important subset of HPC jobs has involved analytics, attempts to uncover useful information and patterns in the data itself.