HPC

The Lead

The Cori Phase 1 system will be the first supercomputer installed in the new Computational Research and Theory Facility now in the final stages of construction at Lawrence Berkeley National Laboratory.

Cray XC40 will be First Supercomputer in Berkeley Lab’s New Computational Research and Theory Facility

April 23, 2015 3:17 pm | by NERSC and Berkeley Lab | News | Comments

The U.S. Department of Energy’s (DOE) National Energy Research Scientific Computing (NERSC) Center and Cray announced they have finalized a new contract for a Cray XC40 supercomputer that will be the first NERSC system installed in the newly built Computational Research and Theory Facility at Lawrence Berkeley National Laboratory.

Mathematicians Reduce Big Data Using Ideas from Quantum Theory

April 23, 2015 2:02 pm | by Queen Mary University of London | News | Comments

A new technique of visualizing the complicated relationships between anything from Facebook...

Big Data is Driving HPC to the Cloud

April 21, 2015 2:09 pm | by Leo Reiter, CTO, Nimbix, Inc. | Blogs | Comments

For many computationally-intensive applications, such as simulation, seismic processing and...

Seagate Storage Technology Powering Four Cray Advanced HPC Implementations

April 21, 2015 11:49 am | by Seagate | News | Comments

Seagate Technology has announced that four Cray customers will be among the first to implement...


Peter Ziegenhein, The Institute of Cancer Research, UK

Peter Ziegenhein

April 20, 2015 4:36 pm | Biographies

Peter Ziegenhein is a Post-Doctoral Training Fellow at The Institute of Cancer Research, in the Joint Department of Physics (with The Royal Marsden Hospital).

Konstantin S. Solnushkin, Saint Petersburg State Polytechnic University

Konstantin S. Solnushkin

April 20, 2015 4:25 pm | Biographies

Konstantin S. Solnushkin received his B.S. and M.S. degrees with honors from Saint Petersburg State Polytechnic University, Russia, in 2003 and 2005, respectively. He is currently working towards his Ph.D. His research interests include the economics of high performance computing and the automated design of computer clusters. He serves on the ISC'15 Research Paper Committee.

Qing Gary Liu, Research Scientist at Oak Ridge National Laboratory

Qing Gary Liu

April 20, 2015 4:07 pm | Biographies

Qing Gary Liu is a Research Scientist at Oak Ridge National Laboratory.

Daniel Hackenberg, TU Dresden

Daniel Hackenberg

April 20, 2015 4:00 pm | Biographies

Daniel Hackenberg, Technical University Dresden, Germany

Holger Fröning, Heidelberg University

JProf. Dr. Holger Fröning

April 20, 2015 3:55 pm | Biographies

JProf. Dr. Holger Fröning is Associate Professor for Computer Engineering at Heidelberg University and Visiting Professor at Graz University of Technology. He serves on the ISC’15 Research Paper Committee.

Thomas Bönisch

April 20, 2015 3:44 pm | Biographies

Thomas Bönisch is Head of Project & User Management at HLRS, Germany.

Dr. Thomas Sterling holds the position of Professor of Informatics and Computing at the Indiana University (IU) School of Informatics and Computing, and serves as Chief Scientist and Executive Associate Director of the Center for Research in Extreme Scale Technologies (CREST).

A Quantum Leap in Computing, Maybe

April 20, 2015 12:07 pm | by Thomas Sterling, Indiana University | Articles | Comments

Quantum computing has been a concept since the 1980s, yet it has remained outside the domain of real-world HPC. Through the era of Moore’s Law, with its exponential progress in feature size, clock rates and resulting performance, the need for alternative paradigms and technologies attracted little interest. But a persistent curiosity within a small community has driven slow but steady advances in the associated ideas.

INCITE seeks research proposals for capability computing: production simulations, including ensembles, that use a large fraction of Leadership Computing Facility systems or require the unique LCF architectural infrastructure, for HPC projects that cannot be performed anywhere else.

INCITE Seeking Proposals to Advance Science and Engineering at U.S. Leadership Computing Facilities

April 20, 2015 10:07 am | by Jeff Gary, OLCF | News | Comments

The Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program is now accepting proposals for high-impact, computationally intensive research campaigns in a broad array of science, engineering and computer science domains.

Rob Farber is an independent HPC expert who works with startups and Fortune 100 companies, as well as government and academic organizations.

Opening Up Performance with OpenSpeedShop, an Open Source Profiler

April 17, 2015 12:12 pm | by Rob Farber | Articles | Comments

There are a number of excellent commercial performance analysis tools on the market. Their big drawback is that they cost money. As a result, acquisition of commercial performance analysis software falls through the cracks, as most funding agencies discourage or prohibit the use of grant money for infrastructure improvements, and few grant authors are willing to take money away from research. Open-source tools are able to fill this gap.
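
As a rough sketch of the workflow the article describes, the hypothetical C program below contains a deliberate hotspot; profiled under OpenSpeedShop's program-counter sampling experiment (via the osspcsamp convenience script), most samples should land in accumulate(). The file name, function and parameters are illustrative assumptions, not taken from the article.

/* hotspot.c: a minimal, hypothetical profiling target.
 * Build with debug symbols so samples map back to source lines:
 *   cc -g -O2 -o hotspot hotspot.c
 * Run under OpenSpeedShop's pcsamp experiment:
 *   osspcsamp "./hotspot"
 */
#include <stdio.h>

/* Deliberate O(n^2) hotspot that the profiler should flag. */
static double accumulate(int n)
{
    double sum = 0.0;
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            sum += (double)i * (double)j;
    return sum;
}

int main(void)
{
    printf("sum = %g\n", accumulate(20000));
    return 0;
}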

A new protocol is a major step toward enabling international quantum communications networks over existing optical infrastructure.

Quantum Cryptography at the Speed of Light: First All-photonic Repeaters Enable Quantum Teleportation

April 16, 2015 12:53 pm | by Marit Mitchell, University of Toronto | News | Comments

Imagine having your MRI results sent directly to your phone, with no concern over the security of your private health data. Or knowing your financial information was safe on a server halfway around the world. Or sending highly sensitive business correspondence, without worrying that it would fall into the wrong hands.

Ryft One

Ryft One

April 16, 2015 10:34 am | Ryft Systems, Inc. | Product Releases | Comments

Ryft One is an open platform for analyzing streaming, historical, unstructured and multi-structured data in real time. It is a commercial 1U platform capable of delivering fast, actionable business insights by analyzing both historical and streaming data at 10 gigabytes per second or faster.

ISC High Performance has extended the deadline to apply for the ISC Student Volunteer Program. ISC will provide out-of-town students with accommodation, as well as most meals.

ISC High Performance Issues Urgent Call for Student Volunteers

April 16, 2015 8:46 am | by Suzanne Tracy, Editor-in-Chief, Scientific Computing and HPC Source | News | Comments

ISC High Performance has extended the deadline to apply for the ISC Student Volunteer Program. The new deadline is April 30, 2015. More volunteers are needed this year, as the conference will be hosting a larger number of sessions than in previous years, and the student volunteer program is critical in helping to run the conference as smoothly as possible.

The tape path used for data read back in the world-record tape demo. On the right is the tape head that writes the data; the tape moves to the left, and on the left, an HDD head reads back the data just written.

IBM Research Sets New Record for Tape Storage

April 10, 2015 9:50 am | by IBM | News | Comments

IBM scientists have demonstrated an areal recording density of 123 billion bits of uncompressed data per square inch on low-cost, particulate magnetic tape, a breakthrough that represents the equivalent of a 220 terabyte tape cartridge that could fit in the palm of your hand.
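
As a back-of-the-envelope check on the capacity claim: assuming a nominal half-inch-wide tape and roughly 730 meters of tape per cartridge (the length is an assumption for illustration, not a figure from the announcement), the quoted areal density works out to about 220 terabytes:

$$ 123\ \mathrm{Gbit/in^2} \times 0.5\ \mathrm{in} \times \left(730\ \mathrm{m} \times 39.37\ \mathrm{in/m}\right) \approx 1.77\times 10^{15}\ \mathrm{bits} \approx 221\ \mathrm{TB} $$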

Erik DeBenedictis is on the staff at Sandia National Labs and participates in the IEEE Rebooting Computing initiative and the International Technology Roadmap for Semiconductors.

Rebooting Supercomputing

April 10, 2015 9:11 am | by Erik DeBenedictis, IEEE Rebooting Computing Initiative | Blogs | Comments

Most of the principles behind supercomputing were set in place during the 1940s, so it is not surprising they are in need of rethinking or rebooting. Turing’s computability theory helped many people discover the potential of computers and, with von Neumann’s concept of software stored in memory, enabled some of those people to program computers with easy-to-understand abstractions. This ultimately led to a long period of growth...

Argonne’s decision to utilize Intel’s HPC scalable system framework stems from the fact that it is designed to deliver a well-balanced and adaptable system capable of supporting both compute-intensive and data-intensive workloads.

Intel to Deliver Nation’s Most Powerful Supercomputer at Argonne

April 9, 2015 2:07 pm | by Intel | News | Comments

Intel has announced that the U.S. Department of Energy’s (DOE) Argonne Leadership Computing Facility (ALCF) has awarded Intel Federal LLC, a wholly-owned subsidiary of Intel Corporation, a contract to deliver two next-generation supercomputers to Argonne National Laboratory.

A supercomputer that can do 551 trillion calculations per second is housed at Clemson’s Information Technology Center.

Data-enabled Science: Top500 Supercomputers Provide Universities with Competitive Edge

April 7, 2015 5:02 pm | by Paul Alongi, Clemson University | News | Comments

Researchers have long believed that supercomputers give universities a competitive edge in scientific research, but now they have some hard data showing it’s true. A Clemson University team found that universities with locally available supercomputers were more efficient in producing research in critical fields than universities that lacked supercomputers.

The core circuits of quantum teleportation, which generate and detect quantum entanglement, have been successfully integrated into a photonic chip by an international team of scientists from the universities of Bristol, Tokyo, Southampton and NTT Device Technology Laboratories.

Quantum Teleportation on Chip a Significant Step toward Ultra-High-Speed Quantum Computers

April 6, 2015 4:07 pm | by University of Bristol | News | Comments

The core circuits of quantum teleportation, which generate and detect quantum entanglement, have been successfully integrated into a photonic chip by an international team of scientists from the universities of Bristol, Tokyo, Southampton and NTT Device Technology Laboratories. These results pave the way to developing ultra-high-speed quantum computers and strengthening the security of communication.
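
For context, a textbook identity (added here, not quoted from the article) shows why such circuits suffice: an arbitrary qubit state combined with a shared entangled pair can be rewritten in the Bell basis, so a Bell measurement on the sender's two qubits leaves the receiver's qubit in the original state up to a known Pauli correction:

$$ |\psi\rangle_1\,|\Phi^+\rangle_{23} = \tfrac{1}{2}\Big(|\Phi^+\rangle_{12}\,|\psi\rangle_3 + |\Psi^+\rangle_{12}\,X|\psi\rangle_3 + |\Phi^-\rangle_{12}\,Z|\psi\rangle_3 + |\Psi^-\rangle_{12}\,XZ|\psi\rangle_3\Big) $$

On such a chip, the entanglement-generation and entanglement-detection circuits would play the roles of preparing the shared pair and performing the Bell measurement, respectively.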

During the LHC's second run, particles will collide at a staggering 13 teraelectronvolts (TeV), which is 60 percent higher than any accelerator has achieved before.

U.S. Scientists Celebrate Restart of Large Hadron Collider

April 6, 2015 3:46 pm | by Oak Ridge National Laboratory | News | Comments

The world's most powerful particle accelerator began its second act on April 5. After two years of upgrades and repairs, proton beams once again circulated around the Large Hadron Collider, located at the CERN laboratory near Geneva. With the collider back in action, more than 1,700 U.S. scientists are prepared to join thousands of their international colleagues to study the highest-energy particle collisions ever achieved.
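
The 60 percent figure checks out against the LHC's own Run 1 record of 8 TeV:

$$ \frac{13\ \mathrm{TeV}}{8\ \mathrm{TeV}} - 1 = 0.625 \approx 60\% $$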

Genomics processing is now moving into mainstream clinical applications, as new genomics-based approaches to diagnosis and treatment gain interest.

Efficient, Time-Sensitive Execution of Next-gen Sequencing Pipelines Critical for Translational Medicine

April 6, 2015 3:26 pm | by Suzanne Tracy, Editor-in-Chief, Scientific Computing and HPC Source | Blogs | Comments

Demand for genomics processing is rapidly spreading from research labs to the clinical arena. Genomics is now a "must have" tool for researchers in areas of oncology and rare diseases. It is also becoming a requirement in the clinical space for precision medicine, translational medicine and similar "bench to bedside" initiatives.

Hubble telescope image of stars forming inside a cloud of cold hydrogen gas and dust in the Carina Nebula, 7,500 light-years away. Courtesy of Space Telescope Science Institute

Automation Provides Big Data Solution to Astronomy’s Data Deluge

April 2, 2015 9:40 am | by David Tenenbaum, University of Wisconsin-Madison | News | Comments

It’s almost a rite of passage in physics and astronomy. Scientists spend years scrounging up money to build a fantastic new instrument. Then, when the long-awaited device finally approaches completion, the panic begins: How will they handle the torrent of data? The Square Kilometer Array will have an unprecedented ability to deliver data on the location and properties of stars, galaxies and giant clouds of hydrogen gas.

The current Pangea supercomputer is a 2.3 petaflop system built on SGI ICE X, one of the world's fastest commercial distributed memory supercomputers; it is based on the Intel Xeon E5-2670 v3 processor, comprises 110,592 cores and contains 442 terabytes of memory.

Total Partners with SGI to Upgrade its Pangea Supercomputer

April 1, 2015 11:27 am | by SGI | News | Comments

Total has chosen SGI to upgrade its supercomputer Pangea. Total is one of the largest integrated oil and gas companies in the world, with activities in more than 130 countries. Its 100,000 employees put their expertise to work in every part of the industry — the exploration and production of oil and natural gas, refining, chemicals, marketing and new energies. This updated system would place in the top 10 of the latest TOP500 list.
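
Dividing through the figures quoted in the caption gives a feel for the system's balance (a sketch based only on those quoted numbers):

$$ \frac{2.3\times10^{15}\ \mathrm{FLOPS}}{110{,}592\ \mathrm{cores}} \approx 20.8\ \mathrm{GFLOPS/core}, \qquad \frac{442\ \mathrm{TB}}{110{,}592\ \mathrm{cores}} \approx 4\ \mathrm{GB/core} $$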

The physicists coaxed qDots and ions into working together as a team. The hybrid system combines two completely different quantum systems with one another.

Physicists Succeed in Linking Two Completely Different Quantum Systems

March 31, 2015 12:22 pm | by University of Bonn | News | Comments

Physicists at the Universities of Bonn and Cambridge have succeeded in linking two completely different quantum systems to one another. In doing so, they have taken an important step forward on the way to a quantum computer.

Researchers from the University of Illinois at Urbana-Champaign are using Mira to study the magnetic state of iron selenide, a known high-temperature superconductor, at varying levels of pressure. Courtesy of Lucas Wagner, University of Illinois at Urbana-Champaign

Mira Sheds Light on Mysterious Nature of High-temperature Superconductors

March 30, 2015 2:52 pm | by Jim Collins | News | Comments

Researchers from the University of Illinois at Urbana-Champaign are using supercomputing resources at the Argonne Leadership Computing Facility, a DOE Office of Science User Facility, to shed light on the mysterious nature of high-temperature superconductors. With critical temperatures ranging from 30 Kelvin to 130 Kelvin, this relatively new class of superconductors is high-temperature in name only.
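
The "high-temperature in name only" remark is clear from a simple unit conversion; even the top of the quoted range is far colder than any everyday temperature:

$$ T[^{\circ}\mathrm{C}] = T[\mathrm{K}] - 273.15 \;\Rightarrow\; 30\ \mathrm{K} \approx -243\,^{\circ}\mathrm{C}, \quad 130\ \mathrm{K} \approx -143\,^{\circ}\mathrm{C} $$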

The fact that the neutron is slightly more massive than the proton is the reason why atomic nuclei have exactly those properties that make our world and ultimately our existence possible. Courtesy of Bergische Universität Wuppertal

Supercomputer Calculates Mass Difference between Neutron and Proton, Confirms Theory of Strong Interaction

March 30, 2015 2:45 pm | by Forschungszentrum Jülich | News | Comments

The fact that the neutron is slightly more massive than the proton is the reason why atomic nuclei have exactly those properties that make our world and ultimately our existence possible. Eighty years after the discovery of the neutron, a team of physicists has finally calculated the tiny neutron-proton mass difference. The findings confirm the theory of the strong interaction.
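
For scale, the experimentally measured difference that the calculation had to reproduce is tiny relative to the masses themselves (standard reference values, added here for context):

$$ m_n - m_p \approx 939.565\ \mathrm{MeV}/c^2 - 938.272\ \mathrm{MeV}/c^2 = 1.293\ \mathrm{MeV}/c^2 \approx 0.14\%\ \text{of } m_p $$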

Dr. Jürgen Kohler, Head of NVH CAE and Vehicle Concepts, Daimler AG

Dr. Jürgen Kohler

March 27, 2015 2:35 pm | Biographies

Jürgen Kohler studied Aerospace Engineering at the University of Stuttgart. In 1992, he started his career at Mercedes-Benz AG and became Manager of Crash Simulation in 1997. From 2001 to 2005, he was Senior Manager for CAE Passive Safety, Durability and NVH, and from 2006 to 2010 for CAE Passive Safety, Durability and Stiffness CAE and Test.
