Cray XC40 will be First Supercomputer in Berkeley Lab’s New Computational Research and Theory Facility
April 23, 2015 3:17 pm | by NERSC and Berkeley Lab | News
The U.S. Department of Energy’s (DOE) National Energy Research Scientific Computing (NERSC) Center and Cray announced they have finalized a new contract for a Cray XC40 supercomputer that will be the first NERSC system installed in the newly built Computational Research and Theory facility at Lawrence Berkeley National Laboratory.
A new technique of visualizing the complicated relationships between anything from Facebook...
For many computationally-intensive applications, such as simulation, seismic processing and...
Seagate Technology has announced that four Cray customers will be among the first to implement...
Peter Ziegenhein is a Post-Doctoral Training Fellow at the Institute of Cancer Research, Joint Department of Physics (with The Royal Marsden Hospital).
Konstantin S. Solnushkin received his B.S. and M.S. degrees with honors from Saint Petersburg State Polytechnic University, Russia, in 2003 and 2005, respectively. He is currently working towards his Ph.D. His research interests include economics of high performance computing and automated design of computer clusters. ISC'15 Research Paper Committee
Qing Gary Liu is a Research Scientist at Oak Ridge National Laboratory.
Daniel Hackenberg, Technical University Dresden, Germany
JProf. Dr. Holger Fröning is an Associate Professor of Computer Engineering at Heidelberg University and a Visiting Professor at Graz University of Technology. ISC’15 Research Paper Committee
Thomas Bönisch is Head of Project & User Management at HLRS, Germany.
Quantum computing has been a concept since the 1980s, yet it has remained outside the domain of real-world HPC. Through the era of Moore’s Law, with its exponential progress in feature size, clock rates and resulting performance, alternative paradigms and technologies attracted little interest. But a persistent curiosity within a small community has driven slow yet steady advances in the associated ideas.
The Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program is now accepting proposals for high-impact, computationally intensive research campaigns in a broad array of science, engineering and computer science domains.
There are a number of excellent commercial performance analysis tools on the market. Their big drawback is that they cost money. As a result, acquisition of commercial performance analysis software falls through the cracks, as most funding agencies discourage or prohibit the use of grant money for infrastructure improvements, and few grant authors are willing to take money away from research. Open-source tools are able to fill this gap.
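As a small illustration of the kind of capability that is freely available, the sketch below uses Python's built-in cProfile and pstats modules, which carry no license cost; the profiled function is a made-up stand-in, not a tool or workload mentioned in any of the items here.

```python
# Minimal profiling sketch using only the Python standard library.
# slow_sum is a hypothetical stand-in workload, defined only for this example.
import cProfile
import pstats


def slow_sum(n):
    """Naive loop that gives the profiler something to measure."""
    total = 0
    for i in range(n):
        total += i * i
    return total


profiler = cProfile.Profile()
profiler.enable()
slow_sum(1_000_000)
profiler.disable()

# Report the five entries with the largest cumulative time.
pstats.Stats(profiler).sort_stats("cumulative").print_stats(5)
```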
Quantum Cryptography at the Speed of Light: First All-photonic Repeaters enable Quantum Teleportation
April 16, 2015 12:53 pm | by Marit Mitchell, University of Toronto | News
Imagine having your MRI results sent directly to your phone, with no concern over the security of your private health data. Or knowing your financial information was safe on a server halfway around the world. Or sending highly sensitive business correspondence, without worrying that it would fall into the wrong hands.
Ryft One is an open, commercial 1U platform for analyzing streaming, historical, unstructured and multi-structured data in real time, delivering fast, actionable business insights at 10 gigabytes per second or faster.
ISC High Performance has extended the deadline to apply for the ISC Student Volunteer Program. The new deadline is April 30, 2015. More volunteers are needed this year, as the conference will be hosting a larger number of sessions than in previous years, and the student volunteer program is critical in helping to run the conference as smoothly as possible.
IBM scientists have demonstrated an areal recording density of 123 billion bits of uncompressed data per square inch on low-cost, particulate magnetic tape, a breakthrough which represents the equivalent of a 220 terabyte tape cartridge that could fit in the palm of your hand.
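A rough back-of-envelope check shows how areal density translates into cartridge capacity. This is only a sketch: the tape width and length below are assumptions based on typical LTO-class cartridges, not figures from the IBM announcement.

```python
# Back-of-envelope estimate: areal density -> cartridge capacity.
# Tape width and length are assumptions (typical LTO-class values),
# not numbers from the IBM demonstration.
AREAL_DENSITY_BITS_PER_IN2 = 123e9   # 123 billion bits per square inch
TAPE_WIDTH_M = 0.0127                # ~12.7 mm tape width (assumption)
TAPE_LENGTH_M = 1000.0               # ~1 km of tape per cartridge (assumption)
IN2_PER_M2 = 1550.0                  # square inches per square metre

tape_area_in2 = TAPE_WIDTH_M * TAPE_LENGTH_M * IN2_PER_M2
capacity_bytes = AREAL_DENSITY_BITS_PER_IN2 * tape_area_in2 / 8

print(f"Raw capacity estimate: {capacity_bytes / 1e12:.0f} TB")
# With these assumed dimensions the raw figure comes out around 300 TB,
# the same order of magnitude as the quoted 220 TB cartridge once
# formatting and servo-track overhead are taken into account.
```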
Most of the principles behind supercomputing were set in place during the 1940s, so it is not surprising they are in need of rethinking or rebooting. Turing’s computability theory helped many people discover the potential of computers and, with von Neumann’s concept of software stored in memory, enabled some of those people to program computers with easy-to-understand abstractions. This ultimately led to a long period of growth...
Intel has announced that the U.S. Department of Energy’s (DOE) Argonne Leadership Computing Facility (ALCF) has awarded Intel Federal LLC, a wholly-owned subsidiary of Intel Corporation, a contract to deliver two next-generation supercomputers to Argonne National Laboratory.
Researchers have long believed that supercomputers give universities a competitive edge in scientific research, but now they have some hard data showing it’s true. A Clemson University team found that universities with locally available supercomputers were more efficient in producing research in critical fields than universities that lacked supercomputers.
The core circuits of quantum teleportation, which generate and detect quantum entanglement, have been successfully integrated into a photonic chip by an international team of scientists from the universities of Bristol, Tokyo, Southampton and NTT Device Technology Laboratories. These results pave the way to developing ultra-high-speed quantum computers and strengthening the security of communication.
The world's most powerful particle accelerator began its second act on April 5. After two years of upgrades and repairs, proton beams once again circulated around the Large Hadron Collider, located at the CERN laboratory near Geneva. With the collider back in action, more than 1,700 U.S. scientists are prepared to join thousands of their international colleagues to study the highest-energy particle collisions ever achieved.
Efficient, Time Sensitive Execution of Next-gen Sequencing Pipelines Critical for Translational Medicine
April 6, 2015 3:26 pm | by Suzanne Tracy, Editor-in-Chief, Scientific Computing and HPC Source | Blogs
Demand for genomics processing is rapidly spreading from research labs to the clinical arena. Genomics is now a "must have" tool for researchers in areas of oncology and rare diseases. It is also becoming a requirement in the clinical space for precision medicine, translational medicine and similar "bench to bedside" initiatives.
It’s almost a rite of passage in physics and astronomy. Scientists spend years scrounging up money to build a fantastic new instrument. Then, when the long-awaited device finally approaches completion, the panic begins: How will they handle the torrent of data? The Square Kilometer Array will have an unprecedented ability to deliver data on the location and properties of stars, galaxies and giant clouds of hydrogen gas.
Total has chosen SGI to upgrade its Pangea supercomputer. Total is one of the largest integrated oil and gas companies in the world, with activities in more than 130 countries. Its 100,000 employees put their expertise to work in every part of the industry: the exploration and production of oil and natural gas, refining, chemicals, marketing and new energies. The upgraded system would rank in the top 10 of the latest TOP500 list.
Physicists at the Universities of Bonn and Cambridge have succeeded in linking two completely different quantum systems to one another. In doing so, they have taken an important step forward on the way to a quantum computer.
Researchers from the University of Illinois at Urbana-Champaign are using supercomputing resources at the Argonne Leadership Computing Facility, a DOE Office of Science User Facility, to shed light on the mysterious nature of high-temperature superconductors. With critical temperatures ranging from 30 Kelvin to 130 Kelvin, this relatively new class of superconductors is high-temperature in name only.
Supercomputer Calculates Mass Difference between Neutron and Proton, confirms Theory of Strong Interaction
March 30, 2015 2:45 pm | by Forschungszentrum Jülich | News
The fact that the neutron is slightly more massive than the proton is the reason why atomic nuclei have exactly those properties that make our world and ultimately our existence possible. Eighty years after the discovery of the neutron, a team of physicists has finally calculated the tiny neutron-proton mass difference. The findings confirm the theory of the strong interaction.
Jürgen Kohler studied Aerospace Engineering at the University of Stuttgart. In 1992 he started his career at Mercedes-Benz AG and became Manager of Crash Simulation in 1997. From 2001 to 2005 he was Senior Manager for CAE Passive Safety, Durability and NVH, and from 2006 to 2010 for CAE Passive Safety, Durability and Stiffness CAE and Test.