HPC

The Lead

R.D. McDowall is Principal, McDowall Consulting.

The Cloud Meets GMP Regulations – Part 1: Applicable Regulations

October 20, 2014 2:35 pm | by R D McDowall | Articles | Comments

The purpose of this series is to discuss the impact of GMP (Good Manufacturing Practice) regulations on cloud computing and to debate some of the regulatory issues facing an organization contemplating this approach. In this part, we look at the applicable regulations.

HPCAC-ISC 2015 Student Cluster Competition Accepting Undergraduate Applications

October 20, 2014 11:11 am | by ISC | News | Comments

The HPC Advisory Council and ISC High Performance call on undergraduate students from around the...

IBM to Pay $1.5B to Shed Costly Chip Division

October 20, 2014 10:54 am | by Michelle Chapman, AP Business Writer | News | Comments

IBM will pay $1.5 billion to Globalfoundries in order to shed its costly chip division. IBM...

DOE’s High-Speed Network to Boost Big Data Transfers by Extending 100G Connectivity across Atlantic

October 20, 2014 10:44 am | by ESnet | News | Comments

The DOE’s Energy Sciences Network, or ESnet, is deploying four new high-speed transatlantic...

Meeting the Technical Challenges of Transatlantic Connectivity

October 20, 2014 10:39 am | by ESnet | News | Comments

When ESnet engineers began to study the idea of building a new 100 Gbps network between the US and Europe, a primary concern was ensuring that the service would be robust and built from multiple underlying links — so that if one went down, researchers could still rely on sufficient bandwidth. Based on data collected by Caltech physicist and networking pioneer Harvey Newman, the team understood that multiple cables are sometimes cut simultaneously.
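
As a rough illustration of that design reasoning, the sketch below estimates how much aggregate bandwidth survives when links are cut simultaneously. The link names and capacities are invented for the example and are not ESnet's actual topology.

    # Hypothetical sketch: aggregate bandwidth left after simultaneous link cuts.
    # Link names and capacities are invented, not ESnet's actual topology.
    from itertools import combinations

    links_gbps = {"link_a": 100, "link_b": 100, "link_c": 40, "link_d": 40}

    def surviving_bandwidth(cut):
        """Total capacity remaining after the links in `cut` go down."""
        return sum(cap for name, cap in links_gbps.items() if name not in cut)

    # Enumerate every failure scenario of up to two simultaneous cuts.
    for k in range(3):
        for cut in combinations(links_gbps, k):
            print(f"cut={cut or ('none',)}: {surviving_bandwidth(cut)} Gbps remain")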

Bathymetry image of Lake George: In 2014, a bathymetric and topographic survey conducted by boat and plane mapped the lake bed, shoreline and watershed. Now, within the data visualization center, scientists will be able to zoom in as close as half a meter

State-of-the-Art Visualization Lab to Display Streaming Data in Real-Time

October 20, 2014 10:00 am | by IBM | News | Comments

The Jefferson Project announced new milestones in a multimillion-dollar collaboration that seeks to understand and manage complex factors impacting Lake George. A new data visualization laboratory features advanced computing and graphics systems that allow researchers to visualize sophisticated models and incoming data on weather, runoff and circulation patterns. The lab will display streaming data from various sensors in real-time.

Shown here is a square-centimeter chip containing the nTron adder, which performed the first computation using the researchers' new superconducting circuit. Courtesy of Adam N. McCaughan

Nanocryotron could Unlock Power of Superconducting Computer Chips

October 17, 2014 10:43 am | by Larry Hardesty, MIT | News | Comments

Computer chips with superconducting circuits — circuits with zero electrical resistance — would be 50 to 100 times as energy-efficient as today’s chips, an attractive trait given the increasing power consumption of the massive data centers that power the Internet’s most popular sites. Superconducting chips also promise greater processing power.

SGI UV for SAP HANA

October 17, 2014 10:30 am | SGI | Product Releases | Comments

SGI UV for SAP HANA is a purpose-built, in-memory computing appliance for growing environments running the SAP HANA platform. SAP-certified and available as a 4- or 8-socket single-node system with up to 6 TB of in-memory computing capacity, the appliance is designed to enable the largest enterprises to achieve real-time operations and business breakthroughs with SAP HANA at extreme scale.

Cray Urika-XA System for Big Data Analytics

October 16, 2014 9:53 am | Cray Inc. | Product Releases | Comments

The Cray Urika-XA System is an open platform for high-performance big data analytics, pre-integrated with the Apache Hadoop and Apache Spark frameworks. It is designed to provide users with the benefits of a turnkey analytics appliance combined with a flexible, open platform that can be modified for future analytics workloads.
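
Since Apache Hadoop and Apache Spark are standard open-source frameworks, the flavor of job such a pre-integrated platform targets can be sketched with ordinary PySpark code; the HDFS path below is a placeholder, and nothing here is specific to the Urika-XA.

    # Minimal PySpark sketch of a Spark analytics job; the input path is a
    # placeholder and nothing here is specific to the Cray Urika-XA.
    from pyspark import SparkContext

    sc = SparkContext(appName="word-count-sketch")

    counts = (sc.textFile("hdfs:///data/sample.txt")      # read raw text from HDFS
                .flatMap(lambda line: line.split())        # split lines into words
                .map(lambda word: (word, 1))               # pair each word with a count
                .reduceByKey(lambda a, b: a + b))          # sum counts per word

    # Print the ten most frequent words.
    for word, n in counts.takeOrdered(10, key=lambda kv: -kv[1]):
        print(word, n)

    sc.stop()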

Winning NVIDIA’s 2014 Early Stage Challenge helped GPU-powered startup Map-D bring interactivity to big data in vivid ways.

Hot Young Startups Vie for $100,000 GPU Challenge Prize

October 16, 2014 9:24 am | by Suzanne Tracy, Editor-in-Chief, Scientific Computing and HPC Source | News | Comments

NVIDIA is looking for a dozen would-be competitors for next year’s Early Stage Challenge, which takes place as part of its Emerging Companies Summit (ECS). In this seventh annual contest, hot young startups using GPUs vie for a single $100,000 grand prize.

10/40/56 Gigabit Ethernet Switches for Hyperscale and Cloud Data Centers

October 15, 2014 3:53 pm | Mellanox Technologies, Inc. | Product Releases | Comments

SX1400, SX1700 and SX1710 Ethernet switches are top-of-rack Open Ethernet software-defined networking (SDN) 10/40/56 Gigabit switches. Based on Mellanox’s SwitchX-2 switch ICs, they provide enhanced control plane capabilities and allow for the design of hyperscale data center networks and control-intensive cloud applications.

IBM is focusing its storage business on a new model for enterprise data storage that is optimized for interoperability across hardware and software solutions.

Software Defined Storage a Tipping Point in Taming Big Data Deluge

October 14, 2014 4:57 pm | by IBM | News | Comments

In a keynote speech at IBM Enterprise, Jamie Thomas, General Manager, Storage and Software Defined Systems at IBM, unveiled a bold strategy for the company’s storage business. Building upon the Software Defined Storage portfolio announced last May, IBM is focusing its storage business on a new model for enterprise data storage that is optimized for interoperability across hardware and software solutions.

Artist impression of an electron wave function (blue), confined in a crystal of silicon-28 atoms (black), controlled by a nanofabricated metal gate (silver). Courtesy of Dr. Stephanie Simmons/UNSW

New Records: Qubits Process Quantum Data with More than 99% Accuracy

October 14, 2014 4:04 pm | by UNSW Australia | News | Comments

Two research teams have found distinct solutions to a critical challenge that has held back the realization of super powerful quantum computers. The teams, working in the same laboratories at UNSW Australia, created two types of quantum bits, or "qubits" — the building blocks for quantum computers — that each process quantum data with an accuracy above 99 percent.
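
As context for that claim (the specific benchmarking protocols the two teams used are not described in this summary), qubit operations are commonly scored by a fidelity between the intended state and the state the device actually produces, so "accuracy above 99 percent" corresponds to a fidelity of at least 0.99. For a pure target state the standard definition is:

    F(|\psi\rangle, \rho) = \langle \psi | \rho | \psi \rangle \ge 0.99

where |\psi\rangle is the intended state, \rho is the state actually prepared, and F = 1 would mean the operation is perfect.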

An innovative piece of research looks into the matter of machine morality, and questions whether it is “evil” for robots to masquerade as humans.

How to Train your Robot: Can We Teach Robots Right from Wrong?

October 14, 2014 12:46 pm | by Taylor & Francis | News | Comments

From performing surgery to driving cars, today’s robots can do it all. With chatbots recently hailed as passing the Turing test, it appears robots are becoming increasingly adept at posing as humans. As machines become ever more integrated into human lives, the need to imbue them with a sense of morality becomes increasingly urgent. But can we really teach robots how to be good? An innovative piece of research looks into the matter of machine morality.

2015 Rice Oil & Gas High Performance Computing Workshop

October 13, 2014 2:45 pm | by Rice University | Events

The Oil and Gas High Performance Computing (HPC) Workshop, hosted annually at Rice University, is the premier meeting place for discussion of challenges and opportunities around high performance computing, information technology, and computational science and engineering.

On Tuesday, October 7, in New York City, IBM Watson Group Senior Vice President Mike Rhodin and travel entrepreneur Terry Jones attended the opening of IBM Watson's global headquarters in New York City's Silicon Alley. Terry Jones is launching a new compa

IBM Watson Fuels Next Generation of Cognitive Computing

October 13, 2014 11:32 am | by IBM | News | Comments

Next-gen leaders push themselves every day to answer this key question: How can my organization make a difference? IBM is helping to deliver the answer with new apps powered by Watson to improve the quality of life. IBM's Watson is a groundbreaking platform with the ability to interact in natural language, process vast amounts of disparate forms of big data and learn from each interaction.

Rob Farber is an independent HPC expert to startups and Fortune 100 companies, as well as government and academic organizations.

High Performance Parallelism Pearls: A Teaching Juggernaut

October 13, 2014 9:52 am | by Rob Farber | Blogs | Comments

High Performance Parallelism Pearls, the latest book by James Reinders and Jim Jeffers, is a teaching juggernaut that packs the experience of 69 authors into 28 chapters designed to get readers running on the Intel Xeon Phi family of coprocessors, and to provide tools and techniques for adapting legacy codes and increasing application performance on Intel Xeon processors.

Male Scarlet Tanager (Piranga olivacea). The Scarlet Tanager is a vibrant songster of eastern hardwood forests. Widespread breeders in the East, they are long-distance migrants that move all the way to South America for the winter. Courtesy of Kelly Colga

NSF Awards $15 Million to Environmental Science Data Project

October 10, 2014 3:38 pm | by NSF | News | Comments

As with the proverbial canary in the coal mine, birds serve as an indicator of the health of our environment. Many common species have experienced significant population declines within the last 40 years. Suggested causes include habitat loss and climate change; however, to fully understand bird distribution relative to the environment, extensive data are needed.

The system is being used to cool Magnus, one of the center's supercomputers, which is able to deliver processing power in excess of a petaflop. Courtesy of iVEC

Pawsey Magnus Supercomputer Utilizing Water-saving Groundwater System

October 10, 2014 12:27 pm | by Teresa Belcher, ScienceNetwork WA | News | Comments

More than 2.8 megaliters of water has been saved in just under a year using groundwater to cool the Pawsey Centre supercomputer in Perth. To make that happen, scientists have undertaken stringent tests to ensure that returning heated water to the Mullalloo aquifer has no adverse effects.

NASA’s Traffic and Atmospheric Information for General Aviation (TAIGA) technology system is capable of showing pilots the altitude of nearby terrain via color. Yellow identifies terrain that is near the aircraft’s altitude and red shows the terrain that

New NASA Technology Brings Critical Data to Pilots over Remote Alaskan Territories

October 10, 2014 11:58 am | by NASA | News | Comments

NASA has formally delivered to Alaskan officials a new technology that could help pilots flying over the vast wilderness expanses of the northern-most state. The technology is designed to help pilots make better flight decisions, especially when disconnected from the Internet, telephone, flight services and other data sources normally used by pilots.

Crops growing in an Egyptian oasis, with the Pyramids of Giza in the background. Courtesy of Purdue University

Powerful Web-based Geospatial Data Project puts Major Issues on the Map

October 9, 2014 2:02 pm | by NSF | News | Comments

Technology is putting complex topics like severe weather and climate change on the map — literally. Mapping data associated with specific geographic locations is a powerful way to glean new and improved knowledge from data collections and to explain the results to policymakers and the public. Particularly useful is the ability to layer different kinds of geospatial data on top of one another and see how they interact.
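
A toy sketch of that layering idea appears below; the gridded field and station points are entirely synthetic and are not the project's actual data or tooling.

    # Toy illustration of layering geospatial data: a gridded field with point
    # observations drawn on top. All data here are synthetic.
    import numpy as np
    import matplotlib.pyplot as plt

    # Layer 1: a gridded field over a lon/lat box (e.g., a modeled climate variable).
    lon = np.linspace(-95.0, -85.0, 100)
    lat = np.linspace(35.0, 45.0, 100)
    grid_lon, grid_lat = np.meshgrid(lon, lat)
    field = 20 + 5 * np.sin(grid_lon / 3.0) * np.cos(grid_lat / 3.0)

    # Layer 2: scattered point observations (e.g., station reports).
    rng = np.random.default_rng(0)
    stn_lon = rng.uniform(-95.0, -85.0, 30)
    stn_lat = rng.uniform(35.0, 45.0, 30)

    fig, ax = plt.subplots()
    mesh = ax.pcolormesh(grid_lon, grid_lat, field, shading="auto", alpha=0.7)
    ax.scatter(stn_lon, stn_lat, c="black", s=15, label="stations")
    fig.colorbar(mesh, ax=ax, label="synthetic field")
    ax.set_xlabel("longitude")
    ax.set_ylabel("latitude")
    ax.legend()
    plt.show()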

The Columbia Supercomputer at NASA's Advanced Supercomputing Facility at Ames Research Center. Courtesy of Trower, NASA

SC14 Plenary, led by SGI and NASA, to Focus on Importance of Supercomputers in Society

October 9, 2014 12:18 pm | by SC14 | News | Comments

Supercomputing 2014 (SC14) is announcing a new “HPC Matters” plenary that will examine the important roles that high performance computing (HPC) plays in every aspect of society, from simplifying manufacturing to powering tsunami warnings and hurricane predictions to improving care for cancer patients.

A comparison of two weather forecast models for the New Jersey area. At left is the forecast that doesn't distinguish local hazardous weather; at right is the High Resolution Rapid Refresh (HRRR) model that clearly depicts where local thunderstorms,

Weather Service Storm Forecasts Get More Localized

October 8, 2014 11:53 am | by Seth Borenstein, AP Science Writer | News | Comments

The next time some nasty storms are heading your way, the National Weather Service says it will have a better forecast of just how close they could come to you. The weather service started using a new high-resolution computer model that officials say will dramatically improve forecasts for storms up to 15 hours in advance. It should better pinpoint where and when tornadoes, thunderstorms and blizzards are expected.

A new principle, called data smashing, estimates the similarities between streams of arbitrary data without human intervention, and without access to the data sources.

Data Smashing Could Unshackle Automated Discovery

October 8, 2014 11:45 am | by Cornell University | News | Comments

A little-known secret in data mining is that simply feeding raw data into a data analysis algorithm is unlikely to produce meaningful results. New discoveries often begin with comparison of data streams to find connections and spot outliers. But most data comparison algorithms today have one major weakness — somewhere, they rely on a human expert. Yet experts aren’t keeping pace with the complexities of big data.
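
One parameter-free way to compare raw streams without hand-built features is sketched below using a normalized compression distance; this is a generic stand-in for illustration only, not the Cornell team's data-smashing algorithm.

    # Generic, feature-free similarity between two raw byte streams using
    # normalized compression distance (NCD). Illustrative stand-in only;
    # this is NOT the Cornell "data smashing" algorithm.
    import os
    import zlib

    def ncd(x: bytes, y: bytes) -> float:
        """Lower values suggest shared structure; values near 1 suggest unrelated streams."""
        cx = len(zlib.compress(x))
        cy = len(zlib.compress(y))
        cxy = len(zlib.compress(x + y))
        return (cxy - min(cx, cy)) / max(cx, cy)

    a = bytes(30 * (i % 7) for i in range(4000))        # a periodic stream
    b = bytes(30 * ((i + 3) % 7) for i in range(4000))  # a shifted copy of the same process
    c = os.urandom(4000)                                # an unrelated random stream

    print(ncd(a, b))  # small: the two streams share structure
    print(ncd(a, c))  # close to 1: little shared structure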

On Tuesday, October 7, IBM Watson Group Vice President, Client Experience Centers, Ed Harbour opens the IBM Watson global headquarters in New York City's Silicon Alley. (Courtesy Jon Simon/Feature Photo Service for IBM)

IBM Watson Global Headquarters Opens for Business in Silicon Alley

October 8, 2014 10:33 am | by IBM | News | Comments

IBM Watson Group's global headquarters, at 51 Astor Place in New York City's Silicon Alley, is open for business. The Watson headquarters will serve as a home base for more than 600 IBM Watson employees, just part of the more than 2,000 IBMers dedicated to Watson worldwide. In addition to a sizeable employee presence, IBM is opening its doors to area developers and entrepreneurs, hosting industry workshops, seminars and networking events.

Tom Conte is President-elect, IEEE Computer Society; Professor of Computer Science and Electrical & Computer Engineering, Georgia Institute of Technology; and Elie Track is President, IEEE Council on Superconductivity; CEO, nVizix.

Technology Time Machine Looks Far Ahead at Future of Processing

October 7, 2014 3:25 pm | by Tom Conte and Elie Track, IEEE | Blogs | Comments

The IEEE Technology Time Machine (TTM) is going further into the future. Now in its third year, the annual two-day IEEE meeting is mixing things up a little in terms of format and topics. Rather than just looking at how some technologies might evolve in the next decade, experts and visionaries are going to look out to 2035 and beyond.

IBM has announced new capabilities for its System z mainframe.

IBM Delivers New Analytics Offerings for the Mainframe to Provide Real-Time Customer Insights

October 7, 2014 2:09 pm | by IBM | News | Comments

Building on client demand to integrate real-time analytics with consumer transactions, IBM has announced new capabilities for its System z mainframe. The integration of analytics with transactional data can provide businesses with real-time, actionable insights on commercial transactions as they occur, allowing them to take advantage of new opportunities to increase sales and to minimize losses through fraud prevention.

Red Hat Storage Server 3

October 6, 2014 3:09 pm | Red Hat | Product Releases | Comments

Red Hat Storage Server 3 is an open software-defined storage solution for scale-out file storage that is designed for data-intensive enterprise workloads including big data, operational analytics and enterprise file sharing and collaboration. It is based on the open source GlusterFS 3.6 file system and Red Hat Enterprise Linux 6.

The Hewlett-Packard logo is seen outside the company's headquarters in Palo Alto, CA. Hewlett-Packard Co. is splitting itself into two companies, one focused on its personal computer and printing business and another on technology services, such as data s

Hewlett-Packard Splits off PC, Printer Businesses

October 6, 2014 2:47 pm | by AP | News | Comments

Hewlett-Packard is splitting itself into two companies, one focused on its personal computer and printing business and another on technology services, such as data storage, servers and software, as it aims to drive profits higher. Hewlett-Packard, like other PC makers, has been facing changing consumer tastes — moving away from desktops and laptops and toward smartphones and tablets.
