The ISC Big Data Conference was organized for the first time last year, and it brought together an interesting mix of users and vendors from the enterprise and HPC communities. This year, ISC is expecting key technical experts and IT decision-makers in the big data domain. The conference will be held in Heidelberg at the Marriott Hotel, preceded by ISC Cloud’14.
IBM has debuted new Power Systems servers that allow data centers to manage staggering data...
Over the last few decades researchers have characterized a set of clock genes that drive daily...
A team led by Houston Methodist Research Institute (HMRI) scientists has found that Alzheimer's...
David A. Bader is a Full Professor in the School of Computational Science and Engineering, College of Computing, at Georgia Institute of Technology, and Executive Director for High Performance Computing. He received his Ph.D. in 1996 from The University of Maryland, and his research is supported through highly-competitive research awards, primarily from NSF, NIH, DARPA, and DOE.
Mr. Wierse studied Mathematics at Bonn University and briefly worked at the Institute for Applied Mathematics before moving to Stuttgart in 1991 to pursue his PhD in the visualization department of the university's Computing Centre. In 1997, together with his colleagues, he founded the start-up VirCinity (later Visenso) and, as managing director, coordinated the commercialization of the COVISE visualization software.
Mahdi Bohlouli is a Ph.D. candidate at the University of Siegen. His research interests include Cloud & Grid Computing and Distributed Systems, Knowledge Representation and Modeling, and Big Data.
The evolution of cluster technologies is expected to substantially impact emerging research areas, such as the increasingly important Data Science field. Therefore, we have chosen this year to highlight research topics expected to bring substantial progress in the way clusters can help in addressing Big Data challenges. Specific topics are dedicated to this direction within all conference tracks alongside more traditional topics. In addition, special tutorials and workshops will focus on cluster technologies for Big Data storage and processing.
Technology Academy Finland (TAF) has named innovator Prof. Stuart Parkin the winner of the 2014 Millennium Technology Prize, the prominent award for technological innovation. Parkin receives the prize in recognition of his discoveries, which have enabled a thousand-fold increase in the storage capacity of magnetic disk drives. His innovations have led to a huge expansion of data acquisition and storage capacities.
Is Big Data really the biggest challenge at the moment for translational science? Certainly there are issues with the complexity and size of omics data, which Big Data techniques can help address, but there are two more pressing challenges: enabling collaboration whilst facilitating information sharing, and the ability to better interpret multiple different omics data (multi-omics).
The HPC Advisory Council, an organization for high-performance computing research, outreach and education, has announced that, in conjunction with ISC’14, it will host the 5th Annual HPC Advisory Council European Workshop 2014 at the Congress Center Leipzig on June 22, 2014. The workshop will focus on HPC productivity, advanced HPC topics and futures.
Today's enterprises face unique challenges. In the past, the requirement was simply to upgrade. Today, it's about building an integrated strategy that spans multiple technologies, both existing and new. For example, there is more diversity than ever before in database technology, server technology and data center infrastructure, to name a few areas. At the moment, none of these technologies is replacing the others; instead, they need to be integrated.
Advance registration at reduced rates is now open for the 2014 International Supercomputing Conference (ISC’14), which will be held June 22-26 in Leipzig, Germany. By registering now, ISC’14 attendees can save over 25 percent off the onsite registration rates at the Leipzig Congress Center.
Ryan Kennedy, University of Houston political science professor, and his co-researchers detail new research about the problematic use of big data from aggregators such as Google’s Google Flu Trend. Numbers and data can be critical tools in bringing complex issues into a crisp focus. The understanding of diseases, for example, benefits from algorithms that help monitor their spread. But without context, a number may just be a number.
Ben Recht is looking for problems. He develops mathematical strategies to help researchers, from urban planners to online retailers, cut through blizzards of data to find what they’re after. He resists the “needle in the haystack” metaphor because, he says, the researchers, engineers and business people he has worked with usually don’t know enough about their data to reach their goal.
Businesses increasingly report that they are able to boost their productivity and competitiveness in the global market by deploying computer simulations and digital modeling. Such applications require high-end computing power and storage that are provided by HPC products and services. The ISC’14 two-day Industry Innovation through HPC track is designed to help engineers, manufacturers and designers gain the right set of tools and methods.
Next week, Scientific Computing will host a live panel discussion that looks at how a unique supercomputing system, created to serve the needs of a scientific community alliance in seven northern German states, has unified datacenter resources to address big data challenges. By streamlining the analysis process through automation, the HLRN alliance has improved performance and increased accuracy, resulting in greater efficiency.
How can organizations embrace — instead of brace for — the rapidly intensifying collision of public and private clouds, HPC environments and Big Data? The current go-to solution for many organizations is to run these technology assets in siloed, specialized environments. This approach falls short, however, typically taxing one datacenter area while others remain underutilized, functioning as little more than expensive storage space.
At Cycle Computing we’re seeing several large trends related to Big Data and analytics. We started talking about the concept of Big Compute back in October 2012. In many ways, it’s the collision where HPC meets the challenges of Big Data. As our technical capabilities to collect and store data continue to expand, the problem of how we access and use that data is only growing.
Scalable Productivity and the Ever-Increasing Tie between Data Analytics, Data Management and Computation — March 7, 2014, by Barry Bolding, Cray
Cray continues to see an increasing trend in the HPC marketplace that we are calling “data-intensive” supercomputing. The dramatic growth in scientific, commercial and social data is resulting in an expanded customer base that is asking for much more complex analysis and simulation.
In 2013, the term big data continued to dominate as a source of technology challenges, experimentation and innovation. It’s no surprise then that many business and IT executives are suffering from big data exhaustion, causing Gartner to deem 2013 as the year the technology entered the “Trough of Disillusionment.”
From the start of the supercomputer era in the 1960s — and even earlier — an important subset of HPC jobs has involved analytics — attempts to uncover useful information and patterns in the data itself. Cryptography, one of the original scientific-technical computing applications, falls predominantly into this category.
Steve Conway, IDC VP for HPC, explains that, to date, most data-intensive HPC jobs in the government, academic and industrial sectors have involved the modeling and simulation of complex physical and quasi-physical systems. However, he notes that from the start of the supercomputer era in the 1960s — and even earlier — an important subset of HPC jobs has involved analytics, attempts to uncover useful information and patterns in the data itself.
Big Data tools such as Grok and IBM Watson are enabling large organizations to behave more like agile startups. Of the transformative technology developments that have ushered in the current frenzy of activity along the information superhighway, the 1994 invention of the “Wiki” by Ward Cunningham is among the most disruptive.
The 10-day tour of Europe was not your typical itinerary — Garching, Karlsruhe, Villigen, Hamburg and Oxford. In January. But David Brown and Craig Tull of the Computational Research Division and Alex Hexemer of the Advanced Light Source weren’t touring to see the sights — they were more interested in seeing the lights: powerful scientific instruments known as light sources that use intense X-rays to study materials.
Size alone does not define big data — it is best defined as a combination of volume, velocity, variety and value. Kevin Geraghty, head of analytics at 360i, defined the goal of big data analytics well when he said: “We are trying to listen to what the customer is telling us through their behavior.” The goal of big data analytics is to make the best business decisions possible.
Just as Netflix uses an algorithm to recommend movies we ought to see, a Stanford software system offers by-the-moment advice to thousands of server-farm computers on how to efficiently share the workload. We hear a lot about the future of computing in the cloud, but not much about the efficiency of the data centers that make the cloud possible, where clusters work together to host applications ranging from big data analytics
The Russian Ministry of Education and Science has awarded a $3.4 million “mega-grant” to Alexei Klimentov, Physics Applications Software Group Leader at the U.S. Department of Energy’s Brookhaven National Laboratory, to develop new “big data” computing tools for the advancement of science.
AT&T and IBM have announced a new global alliance agreement to develop solutions that help support the "Internet of Things." The companies will combine their analytic platforms, cloud and security technologies with privacy in mind to gain more insights on data collected from machines in a variety of industries.