The Lead

As the United States pursues the next generation of computing (exascale), new software-centered partnerships could be the key to maximizing economic benefits for Americans

Supporting America’s Economic Competitiveness: A Look at Federal Supercomputing Leadership

October 28, 2014 11:18 am | by Council on Competitiveness | News | Comments

The Council on Competitiveness has released a new report, titled Solve. The Exascale Effect: The Benefits of Supercomputing Investment for U.S. Industry, that explores the value of government leadership in supercomputing for industrial competitiveness. As the federal government pursues exascale computing to achieve national security and science missions, Solve examines how U.S.-based companies also benefit from leading-edge computation.

IBM Launches Humanitarian Initiatives to Help Contain Ebola

October 27, 2014 3:16 pm | by IBM | News | Comments

IBM has launched several initiatives to help curb the spread of Ebola in West Africa. They...

Cray Urika-XA System for Big Data Analytics

October 16, 2014 9:53 am | Product Releases | Comments

The Cray Urika-XA System is an open platform for high-performance big data analytics, pre-...

Hot Young Startups Vie for $100,000 GPU Challenge Prize

October 16, 2014 9:24 am | by Suzanne Tracy, Editor-in-Chief, Scientific Computing and HPC Source | News | Comments

NVIDIA is looking for a dozen would-be competitors for next year’s Early Stage Challenge, which...

On Tuesday, October 7, IBM Watson Group Senior Vice President Mike Rhodin and travel entrepreneur Terry Jones attended the opening of IBM Watson's global headquarters in New York City's Silicon Alley. Terry Jones is launching a new compa...

IBM Watson Fuels Next Generation of Cognitive Computing

October 13, 2014 11:32 am | by IBM | News | Comments

Next-gen leaders push themselves every day to answer this key question: How can my organization make a difference? IBM is helping to deliver the answer with new apps powered by Watson to improve the quality of life. IBM's Watson is a groundbreaking platform with the ability to interact in natural language, process vast amounts of disparate forms of big data and learn from each interaction.

A new principle, called data smashing, estimates the similarities between streams of arbitrary data without human intervention, and without access to the data sources.

Data Smashing Could Unshackle Automated Discovery

October 8, 2014 11:45 am | by Cornell University | News | Comments

A little-known secret in data mining is that simply feeding raw data into a data analysis algorithm is unlikely to produce meaningful results. New discoveries often begin with comparison of data streams to find connections and spot outliers. But most data comparison algorithms today have one major weakness: somewhere, they rely on a human expert. And experts aren't keeping pace with the complexities of big data.
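The article doesn't spell out how the data-smashing algorithm itself works. As a rough illustration of the general idea of expert-free stream comparison, the sketch below uses a different, well-known stand-in technique, normalized compression distance, which scores similarity between raw byte streams with no hand-built features or domain knowledge (the streams here are synthetic examples, not data from the study):

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance: near 0 for similar streams,
    near 1 for unrelated ones. zlib stands in for the compressor;
    no features or domain knowledge are needed."""
    cx, cy = len(zlib.compress(x)), len(zlib.compress(y))
    cxy = len(zlib.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

a = bytes(i % 7 for i in range(4000))          # periodic source
b = bytes((i + 3) % 7 for i in range(4000))    # same source, phase-shifted
c = bytes((i * i) % 251 for i in range(4000))  # a different source

print(ncd(a, b))  # small: a and b share structure
print(ncd(a, c))  # larger: a and c do not
```

Because the score depends only on how well the concatenated streams compress together, no human needs to choose features or thresholds in advance, which is the property the Cornell work pushes much further.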

On Tuesday, October 7, IBM Watson Group Vice President, Client Experience Centers, Ed Harbour opens the IBM Watson global headquarters in New York City's Silicon Alley. (Courtesy Jon Simon/Feature Photo Service for IBM)

IBM Watson Global Headquarters Opens for Business in Silicon Alley

October 8, 2014 10:33 am | by IBM | News | Comments

IBM Watson Group's global headquarters, at 51 Astor Place in New York City's Silicon Alley, is open for business. The Watson headquarters will serve as a home base for more than 600 IBM Watson employees, just part of the more than 2,000 IBMers dedicated to Watson worldwide. In addition to a sizeable employee presence, IBM is opening its doors to area developers and entrepreneurs, hosting industry workshops, seminars and networking.

IBM has announced new capabilities for its System z mainframe.

IBM Delivers New Analytics Offerings for the Mainframe to Provide Real-Time Customer Insights

October 7, 2014 2:09 pm | by IBM | News | Comments

Building on client demand to integrate real-time analytics with consumer transactions, IBM has announced new capabilities for its System z mainframe. The integration of analytics with transactional data can give businesses real-time, actionable insights on commercial transactions as they occur, helping them seize new opportunities to increase sales and minimize loss through fraud prevention.

NeuroSolutions Infinity

NeuroSolutions Infinity

September 11, 2014 3:58 pm | NeuroDimension, Inc. | Product Releases | Comments

NeuroSolutions Infinity predictive data analytics and modeling software is designed to streamline data mining by automatically taking care of the entire data modeling process. It includes everything from accessing, cleaning and arranging data, to intelligently trying potential inputs, preprocessing and neural network architectures, to selecting the best neural network and verifying the results.

The Department of Energy’s Advanced Light Source (ALS) Beamline 7.3.3 (SAXS/WAXS/GISAXS/GIWAXS) and endstation at Lawrence Berkeley National Laboratory. Courtesy of Roy Kaltschmidt

Tools for Reducing, Managing, Analyzing and Visualizing Data Transform Beamline Science

September 10, 2014 3:48 pm | by Lawrence Berkeley National Laboratory | News | Comments

Some mysteries of science can only be explained on a nanometer scale — even smaller than a single strand of human DNA, which is about 2.5 nanometers wide. At this scale, scientists can investigate the structure and behavior of proteins that help our bodies fight infectious microbes, and even catch chemical reactions in action. To resolve these very fine details, they rely on synchrotron light sources like the ALS at Berkeley Lab.

Florida Polytechnic University is the newest addition to the State University System of Florida and the only one dedicated exclusively to science, technology, engineering and mathematics (STEM).

New Supercomputing Center to Support Big Data and Analytics, Cybersecurity and other STEM Skills

August 18, 2014 3:57 pm | by IBM | News | Comments

Florida Polytechnic University, Flagship Solutions Group and IBM have announced a new supercomputing center at the University composed of IBM high performance systems, software and cloud-based storage, to help educate students in emerging technology fields. Florida Polytechnic University is the newest addition to the State University System and the only one dedicated exclusively to science, technology, engineering and mathematics (STEM).

The semester-long online course will include video lectures, quizzes, and homework assignments and will provide students with free access to the Blue Waters supercomputer.

Blue Waters Project to Offer Graduate Visualization Course in Spring 2015

August 18, 2014 12:12 pm | by NCSA | News | Comments

NCSA’s Blue Waters project will offer a graduate course on High Performance Visualization for Large-Scale Scientific Data Analytics in Spring 2015 and is seeking university partners who are interested in offering the course for credit to their students. This semester-long online course will include video lectures, quizzes and homework assignments and will provide students with free access to the Blue Waters supercomputer.

CEEDS — Collective Experience of Empathic Data Systems — is trying to make the subconscious ‘visible’ by gauging our sensory and physiological reactions to the flow of Big Data before us. © CEEDS

CEEDS Project: New Ways of Exploring Big Data

August 14, 2014 3:36 pm | by European Commission, CORDIS | News | Comments

In a society that has to understand increasingly big and complex datasets, EU researchers are turning to the subconscious for help in unraveling the deluge of information. Big Data refers to large amounts of data produced very quickly by a high number of diverse sources. Data can either be created by people or generated by machines, such as sensors gathering climate information, satellite imagery, digital pictures and videos...

Michael J. Fox speaks at Lotusphere 2012. The potential to collect and analyze data from thousands of individuals on measurable features of Parkinson's, such as slowness of movement, tremor and sleep quality, could enable researchers to assemble a better...

Intel, Michael J. Fox Foundation Start Smartwatch Study

August 14, 2014 3:25 pm | by Intel | News | Comments

The Michael J. Fox Foundation for Parkinson's Research (MJFF) and Intel have announced a collaboration aimed at improving research and treatment for Parkinson's disease — a neurodegenerative brain disease second only to Alzheimer's in worldwide prevalence. The collaboration includes a multiphase research study using a new big data analytics platform that detects patterns in participant data collected from wearable technologies.

With an emphasis on HPC applications in science, engineering and large-scale data analytics, the Gordon Bell Prize tracks the overall progress in parallel computing.

Finalists Compete for Coveted ACM Gordon Bell Prize in High Performance Computing

August 13, 2014 12:01 pm | by SC14 | News | Comments

With five technical papers contending for one of the most prestigious awards in high performance computing (HPC), the Association for Computing Machinery's (ACM) awards committee has four months left to choose the winner of the 2014 Gordon Bell Prize. The winner will have demonstrated an outstanding achievement in HPC that helps solve critical science and engineering problems.

Optalysys is currently developing two products, a ‘Big Data’ analysis system and an Optical Solver Supercomputer, both of which are expected to be launched in 2017.

Light-speed Computing: Prototype Optical Processor Set to Revolutionize Supercomputing

August 8, 2014 4:13 pm | by Optalysys | News | Comments

Cambridge, UK-based start-up Optalysys has stated that it is only months away from launching a prototype optical processor with the potential to deliver exascale levels of processing power on a standard-sized desktop computer. The company will demonstrate its prototype, which meets NASA Technology Readiness Level 4, in January of next year.

Proficy HMI/SCADA iFIX 5.8 Automation Software

Proficy HMI/SCADA iFIX 5.8 Automation Software

August 8, 2014 12:15 pm | GE Intelligent Platforms | Product Releases | Comments

Proficy HMI/SCADA - iFIX 5.8 is designed to enable users to drive better analytics and leverage more reliability, flexibility and scalability across the enterprise. The real-time information management and SCADA solution includes latest-generation visualization tools and a control engine.

FLOW-3D 11

FLOW-3D 11 Multi-physics Computational Fluid Dynamics Software

August 6, 2014 3:19 pm | Flow Science, Inc. | Product Releases | Comments

FLOW-3D 11 features FlowSight, an advanced visualization tool based on the EnSight post-processor, which offers powerful ways to analyze, visualize and communicate simulation data. Its capabilities include analyzing and comparing multiple simulation results simultaneously, volume rendering, a CFD calculator and flipbooks.

IBM Expands High Performance Computing Capabilities in the Cloud

July 24, 2014 2:18 pm | by IBM | News | Comments

IBM is making high performance computing more accessible through the cloud for clients grappling with big data and other computationally intensive activities. A new option from SoftLayer will provide industry-standard InfiniBand networking technology to connect SoftLayer bare metal servers. This will enable very high data throughput speeds between systems, allowing companies to move workloads traditionally associated with HPC to the cloud.

IBM introduces Storage as a Service on SoftLayer for High Performance Data Management in Cloud

July 15, 2014 11:38 am | by IBM | News | Comments

IBM is announcing a new software-defined storage-as-a-service on IBM SoftLayer, code-named Elastic Storage on Cloud, that gives organizations access to a fully supported, ready-to-run storage environment. The offering includes SoftLayer bare metal resources and high-performance data management, and allows organizations to move data between their on-premise infrastructure and the cloud.

On the Trail of Paradigm-Shifting Methods for Solving Mathematical Models

July 15, 2014 10:11 am | by Hengguang Li | Blogs | Comments

How using CPU/GPU parallel computing is the next logical step: My work in computational mathematics is focused on developing new, paradigm-shifting ideas in numerical methods for solving mathematical models in various fields. This includes the Schrödinger equation in quantum mechanics, the elasticity model in mechanical engineering, the Navier-Stokes equations in fluid mechanics, Maxwell's equations in electromagnetism...

National Data Service Kicks Off with the Materials Data Facility

July 9, 2014 4:12 pm | by Amber Harmon | News | Comments

In nearly every field of science, experiments, instruments, observations, sensors, simulations, and surveys are generating massive data volumes that grow at exponential rates. Discoverable, shareable data enables collaboration and supports repurposing for new discoveries — and for cross-disciplinary research enabled by exchange across communities that include both scientists and citizens.

Moab HPC Suite-Enterprise Edition 8.0

July 7, 2014 10:04 am | Adaptive Computing | Product Releases | Comments

Moab HPC Suite-Enterprise Edition 8.0 (Moab 8.0) is designed to enhance Big Workflow by processing intensive simulations and big data analysis to accelerate insights. It delivers dynamic scheduling, provisioning and management of multi-step/multi-application services across HPC, cloud and big data environments. The software suite bolsters Big Workflow’s core services: unifying data center resources, optimizing the analysis process and guaranteeing services to the business.

Computer analysis of photographs could help doctors diagnose which condition a child with a rare genetic disorder has, say Oxford University researchers.

Computer-aided Diagnosis of Rare Genetic Disorders from Family Photos

June 30, 2014 11:04 am | by University of Oxford | News | Comments

Computer analysis of photographs could help doctors diagnose which condition a child with a rare genetic disorder has, say Oxford University researchers.                           

To be able to use these huge amounts of data, we have to understand them and before that we need to categorize them in an effective, fast and automatic manner.

A Simple Solution for Big Data

June 27, 2014 11:19 am | by SISSA | News | Comments

To be able to use huge amounts of data, we have to understand them, and before that we need to categorize them in an effective, fast and automatic manner. Two researchers have devised a type of cluster analysis, the grouping of data sets according to their "similarity," based on simple and powerful principles that proved to be very efficient and capable of solving some of the most typical problems encountered in this type of analysis.
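The announcement doesn't name the method, but the description fits density-peak-style clustering, in which cluster centers are points of high local density that sit far from any denser point, and every other point inherits the label of its nearest denser neighbor. The toy sketch below implements that idea under that assumption; the function name, parameters and data are illustrative, not taken from the paper:

```python
import math

def density_peak_clusters(points, d_c, n_centers):
    """Toy density-peaks-style clustering.

    rho[i]   = number of neighbors within radius d_c (local density)
    delta[i] = distance to the nearest point of higher density
    Centers are the n_centers points with the largest rho * delta;
    every other point inherits the label of its nearest denser neighbor.
    """
    n = len(points)
    dist = [[math.dist(p, q) for q in points] for p in points]
    rho = [sum(1 for j in range(n) if j != i and dist[i][j] < d_c)
           for i in range(n)]
    order = sorted(range(n), key=lambda i: -rho[i])  # decreasing density
    delta = [0.0] * n
    nearest = [0] * n
    for rank, i in enumerate(order):
        if rank == 0:                       # densest point: no denser neighbor
            nearest[i], delta[i] = i, max(dist[i])
        else:
            nearest[i] = min(order[:rank], key=lambda j: dist[i][j])
            delta[i] = dist[i][nearest[i]]
    centers = sorted(range(n), key=lambda i: rho[i] * delta[i])[-n_centers:]
    labels = [-1] * n
    for c in centers:
        labels[c] = c
    for i in order:                         # assign high-density points first
        if labels[i] == -1:
            labels[i] = labels[nearest[i]]
    return labels

pts = [(0, 0), (0.1, 0.2), (0.2, 0.1), (5, 5), (5.1, 5.2), (5.2, 4.9)]
labels = density_peak_clusters(pts, d_c=1.0, n_centers=2)
print(labels)  # → [0, 0, 0, 3, 3, 3]: two well-separated clusters
```

The appeal matches the article's claim of simplicity: two quantities per point, one cutoff distance, and no iterative refinement.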

Mechanical engineers at the Babol University of Technology in Mazandaran, Iran, have turned to nature to devise an algorithm based on the survival trials faced by salmon swimming upstream to the spawning grounds to help them fish out the optimal solution

The Great Salmon Run Algorithm

June 25, 2014 10:42 am | by Inderscience Publishers | News | Comments

Mechanical engineers at the Babol University of Technology in Mazandaran, Iran, have turned to nature to devise an algorithm based on the survival trials faced by salmon swimming upstream to the spawning grounds to help them fish out the optimal solution to a given problem.
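The blurb doesn't describe the Great Salmon Run's actual operators. As a generic, clearly labeled stand-in for this class of nature-inspired optimizer, the sketch below runs a simple population-based search in which "survivors" explore near the best solution found so far while the rest of the population restarts at random (all names, parameters and the test function are illustrative):

```python
import math
import random

def bumpy(x):
    """A 1-D test function with several local minima (illustrative)."""
    return (x - 3) ** 2 + 2 * math.sin(5 * x)

def population_search(f, lo, hi, pop_size=30, iters=300, seed=42):
    """Generic population-based stochastic optimizer (NOT the published
    Great Salmon Run operators, which the article does not describe).
    Half the population makes local moves around the best point found so
    far, like salmon pressing upstream; the rest restart at random."""
    rng = random.Random(seed)
    best = min((rng.uniform(lo, hi) for _ in range(pop_size)), key=f)
    for _ in range(iters):
        survivors = [min(max(best + rng.gauss(0, (hi - lo) * 0.05), lo), hi)
                     for _ in range(pop_size // 2)]
        explorers = [rng.uniform(lo, hi)
                     for _ in range(pop_size - pop_size // 2)]
        candidate = min(survivors + explorers, key=f)
        if f(candidate) < f(best):
            best = candidate
    return best

best_x = population_search(bumpy, -10.0, 10.0)
print(best_x)  # the global minimum of bumpy() lies near x = 3.44
```

The balance between local "survivor" moves and random restarts is the usual exploration/exploitation trade-off that metaheuristics like this one tune with their nature-inspired rules.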

Steve Conway is Research VP, IDC High Performance Computing Group.

When Massive Data Never becomes Big Data

June 18, 2014 3:38 pm | by Steve Conway, IDC | Blogs | Comments

The recent PRACE Days conference in Barcelona provided powerful reminders that massive data doesn't always become big data — mainly because moving and storing massive data can cost massive money. PRACE is the Partnership for Advanced Computing in Europe, and the 2014 conference was the first to bring together scientific and industrial users of PRACE supercomputers located in major European nations.

IDC’s new in-depth forecasts are the first that track more than a dozen application and industry segments, including economically important new use cases for HPC.

IDC Announces First In-Depth Forecasts for Worldwide HPC Big Data Market

June 18, 2014 8:57 am | by IDC | News | Comments

IDC has announced the availability of the first in-depth forecasts for high performance data analysis (HPDA), the fast-growing worldwide market for big data workloads that use high performance computing resources. IDC forecasts that the server market for HPDA will grow rapidly at a 23.5 percent compound annual growth rate (CAGR) to reach $2.7 billion in 2018, and that the related storage market will expand to about $1.6 billion in the same year.
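Taken at face value, the CAGR figure lets readers back out the implied base-year market size. Assuming 2014 as the base year (the announcement doesn't state it), four years of 23.5 percent compounding multiply the base by about 2.33:

```python
# Back out the implied base-year market size from IDC's forecast.
# Assumption (not stated in the announcement): 2014 is the base year,
# giving four years of compounding through 2018.
cagr = 0.235
years = 4
server_2018 = 2.7                 # forecast, $ billions
growth = (1 + cagr) ** years      # total multiplier over 4 years
base_2014 = server_2018 / growth  # implied 2014 server market, $ billions
print(round(growth, 3))           # → 2.326
print(round(base_2014, 2))        # → 1.16
```

So under that assumption the forecast implies an HPDA server market of roughly $1.2 billion in 2014.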

Gord Sissons, Product Marketing Manager for IBM Platform Symphony at IBM

The Evolving HPC Cluster: Big Compute meets Big Data

May 28, 2014 4:16 pm | by Gord Sissons, IBM | Blogs | Comments

HPC systems have evolved significantly over the last two decades. While HPC was once the domain of purpose-built supercomputers, today clustered systems rule the roost. Horizontal scaling has proven to be the most cost-efficient way to increase capacity. What supercomputers all have in common today is their reliance on distributed computing.
