Florida Polytechnic University, Flagship Solutions Group and IBM have announced a new supercomputing center at the University composed of IBM high performance systems, software and cloud-based storage, to help educate students in emerging technology fields. Florida Polytechnic University is the newest addition to the State University System and the only one dedicated exclusively to science, technology, engineering and mathematics (STEM).
NCSA’s Blue Waters project will offer a graduate course on High Performance Visualization for...
In a society that has to understand increasingly big and complex datasets, EU researchers are...
The Michael J. Fox Foundation for Parkinson's Research (MJFF) and Intel have announced a...
With five technical papers contending for one of the most highly honored awards in high performance computing (HPC), the awards committee of the Association for Computing Machinery (ACM) has four months left to choose a winner of the prestigious 2014 Gordon Bell Prize. The winner of this prize will have demonstrated an outstanding achievement in HPC that helps solve critical science and engineering problems.
Cambridge, UK-based start-up Optalysys has stated that it is only months away from launching a prototype optical processor with the potential to deliver exascale levels of processing power on a standard-sized desktop computer. The company will demonstrate its prototype, which meets NASA Technology Readiness Level 4, in January of next year.
Proficy HMI/SCADA - iFIX 5.8 is designed to enable users to drive better analytics and achieve greater reliability, flexibility and scalability across the enterprise. The real-time information management and SCADA solution includes latest-generation visualization tools and a control engine.
FLOW-3D 11 features FlowSight, an advanced visualization tool based on the EnSight post-processor, which offers powerful ways to analyze, visualize and communicate simulation data. Its capabilities include the ability to analyze and compare multiple simulation results simultaneously, volume rendering and a CFD calculator, as well as flipbooks.
IBM is making high performance computing more accessible through the cloud for clients grappling with big data and other computationally intensive activities. A new option from SoftLayer will provide industry-standard InfiniBand networking technology to connect SoftLayer bare metal servers. This will enable very high data throughput speeds between systems, allowing companies to move workloads traditionally associated with HPC to the cloud.
IBM is announcing a new software-defined storage-as-a-service on IBM SoftLayer, code-named Elastic Storage on Cloud, that gives organizations access to a fully supported, ready-to-run storage environment. The offering includes SoftLayer bare metal resources and high-performance data management, and allows organizations to move data between their on-premises infrastructure and the cloud.
How using CPU/GPU parallel computing is the next logical step - My work in computational mathematics is focused on developing new, paradigm-shifting ideas in numerical methods for solving mathematical models in various fields. This includes the Schrödinger equation in quantum mechanics, the elasticity model in mechanical engineering, the Navier-Stokes equation in fluid mechanics, Maxwell’s equations in electromagnetism...
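The numerical methods mentioned above typically discretize a PDE on a grid and update it step by step. As a minimal illustration only (not the author's actual methods), here is an explicit finite-difference step for the 1-D heat equation, with the grid size, diffusivity, and boundary conditions chosen arbitrarily — the kind of regular, data-parallel update that maps naturally onto CPU/GPU parallel computing:

```python
# Minimal sketch: explicit finite-difference solution of the 1-D heat
# equation u_t = alpha * u_xx. Grid size, alpha, dt, and the Dirichlet
# boundary values are illustrative assumptions, not from the article.

def heat_step(u, alpha, dx, dt):
    """One explicit Euler step; endpoints held fixed (Dirichlet)."""
    r = alpha * dt / dx**2  # must satisfy r <= 0.5 for stability
    return [u[0]] + [
        u[i] + r * (u[i - 1] - 2 * u[i] + u[i + 1])
        for i in range(1, len(u) - 1)
    ] + [u[-1]]

# Initial condition: a hot spot in the middle of a cold rod.
u = [0.0] * 21
u[10] = 1.0
for _ in range(100):
    u = heat_step(u, alpha=1.0, dx=1.0, dt=0.25)  # r = 0.25, stable
```

Each interior point depends only on its two neighbors, so every point in a time step can be updated independently — exactly the structure GPUs exploit.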
In nearly every field of science, experiments, instruments, observations, sensors, simulations, and surveys are generating massive data volumes that grow at exponential rates. Discoverable, shareable data enables collaboration and supports repurposing for new discoveries — and for cross-disciplinary research enabled by exchange across communities that include both scientists and citizens.
Moab HPC Suite-Enterprise Edition 8.0 (Moab 8.0) is designed to enhance Big Workflow by processing intensive simulations and big data analysis to accelerate insights. It delivers dynamic scheduling, provisioning and management of multi-step/multi-application services across HPC, cloud and big data environments. The software suite bolsters Big Workflow’s core services: unifying data center resources, optimizing the analysis process and guaranteeing services to the business.
Computer analysis of photographs could help doctors diagnose which condition a child with a rare genetic disorder has, say Oxford University researchers.
To make use of huge amounts of data, we first have to understand them, and before that we need to categorize them in an effective, fast and automatic manner. Two researchers have devised a form of cluster analysis, the grouping of data according to their "similarity," based on simple and powerful principles that proved to be very efficient and capable of solving some of the most typical problems encountered in this type of analysis.
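The researchers' specific method is not described in the teaser; as a generic illustration of grouping data by similarity, the following is a minimal k-means sketch (seeding the centroids from the first k points is a simplification chosen for determinism):

```python
import math

def kmeans(points, k, iters=20):
    """Minimal k-means sketch: assign each point to its nearest centroid,
    then move each centroid to its cluster's mean. The first k points
    seed the centroids (a simplification; real code seeds randomly)."""
    centroids = [tuple(p) for p in points[:k]]
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda c: math.dist(p, centroids[c]))
            clusters[nearest].append(p)
        centroids = [
            tuple(sum(coord) / len(c) for coord in zip(*c)) if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
    return centroids, clusters

# Two well-separated 2-D blobs; the first two points seed one centroid each.
pts = [(0, 0), (10, 10), (0, 1), (1, 0), (10, 11), (11, 10)]
centroids, clusters = kmeans(pts, 2)
```

On this toy input the algorithm converges in a single pass, recovering one cluster per blob.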
Mechanical engineers at the Babol University of Technology in Mazandaran, Iran, have turned to nature to devise an algorithm based on the survival trials faced by salmon swimming upstream to the spawning grounds to help them fish out the optimal solution to a given problem.
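The teaser does not detail the salmon-run algorithm itself. As a generic sketch of how nature-inspired metaheuristics work — candidates endure "survival trials," and occasional setbacks are tolerated so the search can escape local optima — here is a minimal simulated-annealing loop (the objective function and all parameters are illustrative, not from the paper):

```python
import math
import random

def anneal(f, x0, steps=5000, temp=1.0, cool=0.999, seed=0):
    """Generic simulated-annealing sketch: usually move downhill, but
    sometimes accept a worse candidate (more often while 'temp' is high)
    so the search can climb out of local minima."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best, fbest = x, fx
    for _ in range(steps):
        cand = x + rng.gauss(0, 0.5)          # propose a nearby candidate
        fc = f(cand)
        if fc < fx or rng.random() < math.exp((fx - fc) / temp):
            x, fx = cand, fc                  # accept (possibly worse) move
            if fx < fbest:
                best, fbest = x, fx           # track best point seen
        temp *= cool                          # gradually get greedier
    return best, fbest

# Minimize a bumpy 1-D function with several local minima.
best, val = anneal(lambda x: (x - 3) ** 2 + math.sin(5 * x), 0.0)
```

The salmon-inspired algorithm uses a different metaphor, but the same basic tension — exploration versus exploitation — drives most such metaheuristics.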
The recent PRACE Days conference in Barcelona provided powerful reminders that massive data doesn't always become big data — mainly because moving and storing massive data can cost massive money. PRACE is the Partnership for Advanced Computing in Europe, and the 2014 conference was the first to bring together scientific and industrial users of PRACE supercomputers located in major European nations.
IDC has announced the availability of the first in-depth forecasts for high performance data analysis (HPDA), the fast-growing worldwide market for big data workloads that use high performance computing resources. IDC forecasts that the server market for HPDA will grow rapidly at a 23.5 percent compound annual growth rate (CAGR) to reach $2.7 billion in 2018, and that the related storage market will expand to about $1.6 billion in the same year.
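The forecast arithmetic can be sanity-checked: a CAGR compounds as future = base × (1 + r)^n. Assuming a 2013 base year (five growth years to 2018 — an assumption, since the teaser does not state the forecast's starting year), the implied starting market size is:

```python
def implied_base(future, cagr, years):
    """Back out the starting market size implied by a CAGR forecast."""
    return future / (1 + cagr) ** years

# IDC forecast: $2.7B HPDA server market in 2018 at 23.5 percent CAGR.
# The 2013 base year (five growth years) is an assumption.
base = implied_base(2.7, 0.235, 5)  # roughly $0.94B
```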
HPC systems have evolved significantly over the last two decades. While HPC was once the domain of purpose-built supercomputers, today clustered systems rule the roost. Horizontal scaling has proven to be the most cost-efficient way to increase capacity. What supercomputers all have in common today is their reliance on distributed computing.
In a sport where milliseconds matter, the 2012 U.S. Women’s Olympic cycling team found their competitive edge in an unlikely place: data science. The team went from a five-second deficit at the world championships to a silver medal at the 2012 London Olympics, a triumphant feat achieved not only through dedication and athletic ability, but also by enhancing training with insights gained from analyzing big data.
Technology entrepreneurs wake up every morning with the goal of creating innovations that can change the world. IBM has announced a new class of innovators that are making their visions a reality by creating apps fueled by Watson's cognitive computing intelligence.
Elastic Storage is capable of reducing storage costs up to 90 percent by automatically moving data onto the most economical storage device. The technology allows enterprises to exploit the exploding growth of data in a variety of forms generated by devices, sensors, business processes and social networks.
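The teaser does not describe Elastic Storage's actual tiering policy; as a toy illustration of automatically placing data on the most economical device, here is an age-based placement rule (the tier names and day thresholds are invented for the example, not IBM specifics):

```python
# Toy sketch of policy-driven storage tiering: data untouched for longer
# than a threshold migrates to a cheaper tier. Tier names and the 7- and
# 30-day thresholds are illustrative assumptions.

def assign_tier(days_since_access, hot_limit=7, warm_limit=30):
    """Pick a storage tier from how recently a file was accessed."""
    if days_since_access <= hot_limit:
        return "flash"
    if days_since_access <= warm_limit:
        return "disk"
    return "tape"

files = {"results.csv": 2, "archive.tar": 90, "model.bin": 14}
placement = {name: assign_tier(age) for name, age in files.items()}
```

Real systems weigh more signals (file size, cost per gigabyte, access patterns), but the principle — demote cold data, keep hot data fast — is the same.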
Spotfire 6.5 analytics platform allows users to easily connect to diverse data sources, including spatial data sources, and create rich visualizations, enabling analytics from the simplest to the most complex levels. Features include the single-seat Spotfire desktop product, which provides the full power and ease of use of the Spotfire platform for individual users...
The IDC HPC User Forum will meet at HLRS in Stuttgart and another location in October 2014. HPC User Forum meetings are open to anyone with an interest in high performance computing or high performance data analysis (big data using HPC), including users, vendors, funders, and others.
A White House review of how the government and private sector use large sets of data has found that such information could be used to discriminate against Americans on issues such as housing and employment even as it makes their lives easier in many ways. "Big data" is everywhere.
The only event dedicated to the next generation of lab informatics applications and to building a searchable, shareable database to improve decision making and efficiency. After the success of last year’s inaugural ELNs, Data Analytics and Knowledge Management (EDKM) event in the US, Pharma IQ has announced the second EDKM conference, to be held on 17-18 June 2014 in Boston, US.
Researchers have found a way for computers to recognize 21 distinct facial expressions - even expressions for complex or seemingly contradictory emotions such as “happily disgusted” or “sadly angry.”
The IDC HPC User Forum will meet at The Grand Hyatt Seattle, September 15 to 17, 2014. HPC User Forum meetings are open to anyone with an interest in high performance computing or high performance data analysis (big data using HPC), including users, vendors, funders, and others.
Ryan Kennedy, University of Houston political science professor, and his co-researchers detail new research about the problematic use of big data from aggregators such as Google Flu Trends. Numbers and data can be critical tools in bringing complex issues into a crisp focus. The understanding of diseases, for example, benefits from algorithms that help monitor their spread. But without context, a number may just be a number.