HPC

The Lead

New software algorithms reduce the time and material needed to produce objects with 3-D printers. Here, the wheel on the left was produced with conventional software and the one on the right with the new algorithms. Courtesy of Purdue University/Bedrich B

New Software Algorithms Speed 3-D Printing, Reduce Waste

October 22, 2014 12:40 pm | by Emil Venere, Purdue University | News | Comments

New software algorithms have been shown to significantly reduce the time and material needed to produce objects with 3-D printers. Researchers from Purdue University have demonstrated one approach that reduces printing time by up to 30 percent and the quantity of support material by as much as 65 percent.

Quantum Holograms could become Quantum Information Memory

October 22, 2014 12:22 pm | by Springer | News | Comments

Russian scientists have developed a theoretical model of quantum memory for light, adapting the...

Counter-measure Offers Cyber Protection for Supply Chains

October 22, 2014 10:14 am | by University of Maryland | News | Comments

The supply chain is ground zero for several recent cyber breaches. Hackers, for example, prey on...

Light-enabled Wi-fi could Tackle Global Data Crunch

October 22, 2014 9:50 am | by University of Edinburgh | News | Comments

High-speed bi-directional wireless technology that uses light to send information securely...


LLNL researcher Monte LaBute was part of a Lab team that recently published an article in PLOS ONE detailing the use of supercomputers to link proteins to drug side effects. Courtesy of Julie Russell/LLNL

Supercomputers Link Proteins to Adverse Drug Reactions

October 21, 2014 10:40 am | by Kenneth K Ma, Lawrence Livermore National Laboratory | News | Comments

The drug creation process often misses side effects that kill at least 100,000 patients a year. LLNL researchers have developed a method that uses supercomputers to identify the proteins responsible for certain adverse drug reactions, running proteins and drug compounds through an algorithm that produces reliable predictions outside of a laboratory setting for drug discovery.
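The teaser does not spell out LLNL's algorithm, but the core idea (scoring large numbers of protein-compound pairs in software rather than in a wet lab) can be sketched in a few lines. The example below is purely illustrative: the feature vectors, scoring function and threshold are hypothetical placeholders, not the published method.

```python
# Illustrative sketch only: exhaustively score every drug compound against
# every candidate protein and flag pairs above a cutoff as possible
# off-target interactions. Data, scoring and threshold are hypothetical.
from itertools import product

def binding_score(protein_features, compound_features):
    # Placeholder dot-product similarity; a real pipeline would use docking
    # or machine-learned models evaluated across an HPC cluster.
    return sum(p * c for p, c in zip(protein_features, compound_features))

proteins = {"P1": [0.2, 0.9, 0.4], "P2": [0.7, 0.1, 0.8]}
compounds = {"drugA": [0.3, 0.8, 0.5], "drugB": [0.9, 0.2, 0.7]}
THRESHOLD = 1.0  # hypothetical cutoff for flagging a pair

for (prot, pf), (comp, cf) in product(proteins.items(), compounds.items()):
    score = binding_score(pf, cf)
    if score > THRESHOLD:
        print(f"{comp} may bind {prot} (score {score:.2f}); candidate for follow-up")
```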

Diver collecting microbial samples from Australian seaweeds for Uncovering Genome Mysteries

Crowdsourced Supercomputing Examines Big Genomic Data

October 21, 2014 9:31 am | by IBM | News | Comments

What do the DNA in Australian seaweed, Amazon River water, tropical plants, and forest soil all have in common? Lots, say scientists. And understanding the genetic similarities of disparate life forms could enable researchers to produce compounds for new medicines, eco-friendly materials, more resilient crops, and cleaner air, water and energy.

R.D. McDowall is Principal, McDowall Consulting.

The Cloud Meets GMP Regulations – Part 1: Applicable Regulations

October 20, 2014 2:35 pm | by R D McDowall | Articles | Comments

The purpose of this series is to discuss the impact of GMP (Good Manufacturing Practice) regulations on cloud computing and to debate some of the regulatory issues facing an organization contemplating this approach. In this part, we look at the applicable regulations.

The HPCAC-ISC Student Cluster Competition is designed to introduce the next generation to the international high performance computing (HPC) community. This is an excellent educational opportunity for students around the world to showcase their knowledge

HPCAC-ISC 2015 Student Cluster Competition Accepting Undergraduate Applications

October 20, 2014 11:11 am | by ISC | News | Comments

The HPC Advisory Council and ISC High Performance call on undergraduate students from around the world to submit their applications to take part in the 2015 Student Cluster Competition (SCC). The 11 teams selected will receive the opportunity to build a small cluster of their own design and run a series of benchmarks and applications in real time over four days on the ISC 2015 exhibition floor.

An IBM logo displayed in Berlin, VT. IBM is paying $1.5 billion to Globalfoundries in order to shed its costly chip division. (AP Photo/Toby Talbot)

IBM to Pay $1.5B to Shed Costly Chip Division

October 20, 2014 10:54 am | by Michelle Chapman, AP Business Writer | News | Comments

IBM will pay $1.5 billion to Globalfoundries in order to shed its costly chip division. IBM Director of Research John E. Kelly III said in an interview on October 20, 2014, that handing over control of the semiconductor operations will allow the business to grow faster, while IBM continues to invest in and expand its chip research.

ESnet installed its first European network node at CERN (the major laboratory outside Geneva that houses the LHC) in mid-September, and is now deploying other equipment necessary to bring the first link online by October.

DOE’s High-Speed Network to Boost Big Data Transfers by Extending 100G Connectivity across Atlantic

October 20, 2014 10:44 am | by ESnet | News | Comments

The DOE’s Energy Sciences Network, or ESnet, is deploying four new high-speed transatlantic links, giving researchers at America’s national laboratories and universities ultra-fast access to scientific data from the Large Hadron Collider and other research sites in Europe. ESnet’s transatlantic extension will deliver a total capacity of 340 gigabits per second and serve dozens of scientific collaborations.

Meeting the Technical Challenges of Transatlantic Connectivity

October 20, 2014 10:39 am | by ESnet | News | Comments

When ESnet engineers began to study the idea of building a new 100 Gbps network between the US and Europe, a primary concern was ensuring the service would be robust and built from multiple underlying links — so that if one went down, researchers could still rely on sufficient bandwidth. Based on data collected by Caltech physicist and networking pioneer Harvey Newman, the team understood that multiple cables are sometimes cut simultaneously.
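A quick way to see why the service is built from several underlying links is to tally how much capacity survives any single cable cut. The sketch below uses an assumed 100/100/100/40 Gbps split across the four links; the article states only the 340 Gbps total, so the split is illustrative.

```python
# Back-of-the-envelope check of the redundancy argument above.
# Four transatlantic links totalling 340 Gbps; the individual capacities
# below are an assumption for illustration, not published figures.
link_capacities_gbps = [100, 100, 100, 40]

total = sum(link_capacities_gbps)
# Capacity left in the worst case, i.e. if the largest single link is cut.
worst_case = total - max(link_capacities_gbps)

print(f"Aggregate capacity: {total} Gbps")                    # 340 Gbps
print(f"Capacity after worst single cut: {worst_case} Gbps")  # 240 Gbps
```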

Bathymetry image of Lake George: In 2014, a bathymetric and topographic survey conducted by boat and plane mapped the lake bed, shoreline and watershed. Now, within the data visualization center, scientists will be able to zoom in as close as half a meter

State-of-the-Art Visualization Lab to Display Streaming Data in Real-Time

October 20, 2014 10:00 am | by IBM | News | Comments

The Jefferson Project announced new milestones in a multimillion-dollar collaboration that seeks to understand and manage complex factors impacting Lake George. A new data visualization laboratory features advanced computing and graphics systems that allow researchers to visualize sophisticated models and incoming data on weather, runoff and circulation patterns. The lab will display streaming data from various sensors in real-time.

Shown here is a square-centimeter chip containing the nTron adder, which performed the first computation using the researchers' new superconducting circuit. Courtesy of Adam N. McCaughan

Nanocryotron could Unlock Power of Superconducting Computer Chips

October 17, 2014 10:43 am | by Larry Hardesty, MIT | News | Comments

Computer chips with superconducting circuits — circuits with zero electrical resistance — would be 50 to 100 times as energy-efficient as today’s chips, an attractive trait given the increasing power consumption of the massive data centers that power the Internet’s most popular sites. Superconducting chips also promise greater processing power.

SGI UV for SAP HANA

October 17, 2014 10:30 am | SGI | Product Releases | Comments

SGI UV for SAP HANA is a purpose-built, in-memory computing appliance for growing environments running the SAP HANA platform. SAP-certified and available as a 4- or 8-socket single-node system with up to 6 TB of in-memory capacity, the appliance is designed to enable the largest enterprises to achieve real-time operations and business breakthroughs with SAP HANA at extreme scale.

Cray Urika-XA System for Big Data Analytics

October 16, 2014 9:53 am | Cray Inc. | Product Releases | Comments

The Cray Urika-XA System is an open platform for high-performance big data analytics, pre-integrated with the Apache Hadoop and Apache Spark frameworks. It is designed to provide users with the benefits of a turnkey analytics appliance combined with a flexible, open platform that can be modified for future analytics workloads.
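Because the appliance ships with Apache Spark pre-integrated, a first analytics job on it looks like an ordinary Spark program. The sketch below is a generic PySpark example, not anything Cray-specific; the HDFS path and the assumption that log lines start with a severity level are hypothetical.

```python
# Minimal PySpark sketch: count log lines by severity level.
# The input path and log format are placeholders for illustration.
from pyspark import SparkContext

sc = SparkContext(appName="log-level-counts")

lines = sc.textFile("hdfs:///data/app_logs/*.log")  # hypothetical location
counts = (lines.map(lambda line: line.split(" ")[0])   # e.g. "ERROR", "WARN"
               .map(lambda level: (level, 1))
               .reduceByKey(lambda a, b: a + b))

for level, n in counts.collect():
    print(level, n)

sc.stop()
```

On a turnkey appliance, the same job would typically be submitted with spark-submit against the pre-configured cluster rather than a locally created SparkContext.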

Winning NVIDIA’s 2014 Early Stage Challenge helped GPU-powered startup Map-D bring interactivity to big data in vivid ways.

Hot Young Startups Vie for $100,000 GPU Challenge Prize

October 16, 2014 9:24 am | by Suzanne Tracy, Editor-in-Chief, Scientific Computing and HPC Source | News | Comments

NVIDIA is looking for a dozen would-be competitors for next year’s Early Stage Challenge, which takes place as part of its Emerging Companies Summit (ECS). In this seventh annual contest, hot young startups using GPUs vie for a single $100,000 grand prize.

10/40/56 Gigabit Ethernet Switches for Hyperscale and Cloud Data Centers

October 15, 2014 3:53 pm | Mellanox Technologies, Inc. | Product Releases | Comments

SX1400, SX1700 and SX1710 Ethernet switches are top-of-rack Open Ethernet software-defined networking (SDN) 10/40/56 Gigabit switches. Based on Mellanox’s SwitchX-2 switch ICs, they provide enhanced control plane capabilities and allow for the design of hyperscale data center networks and control-intensive cloud applications.

IBM is focusing its storage business on a new model for enterprise data storage that is optimized for interoperability across hardware and software solutions.

Software Defined Storage a Tipping Point in Taming Big Data Deluge

October 14, 2014 4:57 pm | by IBM | News | Comments

In a keynote speech at IBM Enterprise, Jamie Thomas, General Manager, Storage and Software Defined Systems at IBM, unveiled a bold strategy for the company’s storage business. Building upon the Software Defined Storage portfolio announced last May, IBM is focusing its storage business on a new model for enterprise data storage that is optimized for interoperability across hardware and software solutions.

Artist impression of an electron wave function (blue), confined in a crystal of silicon-28 atoms (black), controlled by a nanofabricated metal gate (silver). Courtesy of Dr. Stephanie Simmons/UNSW

New Records: Qubits Process Quantum Data with More than 99% Accuracy

October 14, 2014 4:04 pm | by UNSW Australia | News | Comments

Two research teams have found distinct solutions to a critical challenge that has held back the realization of super powerful quantum computers. The teams, working in the same laboratories at UNSW Australia, created two types of quantum bits, or "qubits" — the building blocks for quantum computers — that each process quantum data with an accuracy above 99 percent.

An innovative piece of research looks into the matter of machine morality, and questions whether it is “evil” for robots to masquerade as humans.

How to Train your Robot: Can We Teach Robots Right from Wrong?

October 14, 2014 12:46 pm | by Taylor & Francis | News | Comments

From performing surgery to driving cars, today’s robots can do it all. With chatbots recently hailed as passing the Turing test, it appears robots are becoming increasingly adept at posing as humans. As machines become ever more integrated into human lives, the need to imbue them with a sense of morality becomes increasingly urgent. But can we really teach robots how to be good? An innovative piece of research looks into the matter of machine morality, and questions whether it is “evil” for robots to masquerade as humans.

2015 Rice Oil & Gas High Performance Computing Workshop

October 13, 2014 2:45 pm | by Rice University | Events

The Oil and Gas High Performance Computing (HPC) Workshop, hosted annually at Rice University, is the premier meeting place for discussion of challenges and opportunities around high performance computing, information technology, and computational science and engineering.

On Tuesday, October 7, IBM Watson Group Senior Vice President Mike Rhodin and travel entrepreneur Terry Jones attended the opening of IBM Watson's global headquarters in New York City's Silicon Alley. Terry Jones is launching a new company.

IBM Watson Fuels Next Generation of Cognitive Computing

October 13, 2014 11:32 am | by IBM | News | Comments

Next-gen leaders push themselves every day to answer this key question: How can my organization make a difference? IBM is helping to deliver the answer with new apps powered by Watson to improve the quality of life. IBM's Watson is a groundbreaking platform with the ability to interact in natural language, process vast amounts of disparate forms of big data and learn from each interaction.

Rob Farber is an independent HPC expert to startups and Fortune 100 companies, as well as government and academic organizations.

High Performance Parallelism Pearls: A Teaching Juggernaut

October 13, 2014 9:52 am | by Rob Farber | Blogs | Comments

High Performance Parallelism Pearls, the latest book by James Reinders and Jim Jeffers, is a teaching juggernaut that packs the experience of 69 authors into 28 chapters designed to get readers running on the Intel Xeon Phi family of coprocessors, and to provide tools and techniques for adapting legacy codes and increasing application performance on Intel Xeon processors.

Male Scarlet Tanager (Piranga olivacea). The Scarlet Tanager is a vibrant songster of eastern hardwood forests. Widespread breeders in the East, they are long-distance migrants that move all the way to South America for the winter. Courtesy of Kelly Colga

NSF Awards $15 Million to Environmental Science Data Project

October 10, 2014 3:38 pm | by NSF | News | Comments

As with the proverbial canary in the coal mine, birds serve as an indicator of the health of our environment. Many common species have experienced significant population declines within the last 40 years. Suggested causes include habitat loss and climate change; however, to fully understand bird distribution relative to the environment, extensive data are needed.

The system is being used to cool Magnus, one of the center's supercomputers, which is able to deliver processing power in excess of a petaflop. Courtesy of iVEC

Pawsey Magnus Supercomputer Utilizing Water-saving Groundwater System

October 10, 2014 12:27 pm | by Teresa Belcher, ScienceNetwork WA | News | Comments

More than 2.8 megaliters of water has been saved in just under a year by using groundwater to cool the Pawsey Centre supercomputer in Perth. To make that happen, scientists have undertaken stringent tests to ensure that returning heated water to the Mullalloo aquifer has no adverse effects.

NASA’s Traffic and Atmospheric Information for General Aviation (TAIGA) technology system is capable of showing pilots the altitude of nearby terrain via color. Yellow identifies terrain that is near the aircraft’s altitude, and red shows terrain that is above it.

New NASA Technology Brings Critical Data to Pilots over Remote Alaskan Territories

October 10, 2014 11:58 am | by NASA | News | Comments

NASA has formally delivered to Alaskan officials a new technology that could help pilots flying over the vast wilderness expanses of the northernmost state. The technology is designed to help pilots make better flight decisions, especially when disconnected from the Internet, telephone, flight services and other data sources normally used by pilots.

Crops growing in an Egyptian oasis, with the Pyramids of Giza in the background. Courtesy of Purdue University

Powerful Web-based Geospatial Data Project puts Major Issues on the Map

October 9, 2014 2:02 pm | by NSF | News | Comments

Technology is putting complex topics like severe weather and climate change on the map — literally. Mapping data associated with specific geographic locations is a powerful way to glean new and improved knowledge from data collections and to explain the results to policymakers and the public. Particularly useful is the ability to layer different kinds of geospatial data on top of one another and see how they interact.

The Columbia Supercomputer at NASA's Advanced Supercomputing Facility at Ames Research Center. Courtesy of Trower, NASA

SC14 Plenary, led by SGI and NASA, to Focus on Importance of Supercomputers in Society

October 9, 2014 12:18 pm | by SC14 | News | Comments

Supercomputing 2014 (SC14) is announcing a new “HPC Matters” plenary that will examine the important roles that high performance computing (HPC) plays in every aspect of society from simplifying manufacturing to tsunami warning systems and hurricane predictions to improving care for cancer patients.

A comparison of two weather forecast models for the New Jersey area. The panel at left shows a forecast that does not distinguish local hazardous weather; the panel at right shows the High Resolution Rapid Refresh (HRRR) model, which clearly depicts where local thunderstorms are expected.

Weather Service Storm Forecasts Get More Localized

October 8, 2014 11:53 am | by Seth Borenstein, AP Science Writer | News | Comments

The next time some nasty storms are heading your way, the National Weather Service says it will have a better forecast of just how close they could come to you. The weather service started using a new high-resolution computer model that officials say will dramatically improve forecasts for storms up to 15 hours in advance. It should better pinpoint where and when tornadoes, thunderstorms and blizzards are expected.
