The drug development process often misses side effects that kill at least 100,000 patients a year. LLNL researchers have developed a method that uses high-performance computers to identify the proteins responsible for certain adverse drug reactions, running proteins and drug compounds through an algorithm that produces reliable drug-discovery data outside of a laboratory setting.
What do the DNA in Australian seaweed, Amazon River water, tropical plants, and forest soil all...
The HPC Advisory Council and ISC High Performance call on undergraduate students from around the...
Computer chips with superconducting circuits — circuits with zero electrical resistance — would be 50 to 100 times as energy-efficient as today’s chips, an attractive trait given the increasing power consumption of the massive data centers that power the Internet’s most popular sites. Superconducting chips also promise greater processing power.
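To put the 50-to-100x efficiency range in perspective, a back-of-the-envelope sketch (the 20 MW facility size below is a hypothetical illustration, not a figure from the article; only the efficiency range comes from the text):

```python
# Sketch: what a 50-100x energy-efficiency gain would mean for a
# data center's power draw. Facility size is a hypothetical example.

def projected_power_mw(current_power_mw: float, efficiency_factor: float) -> float:
    """Power draw if the same workload ran on chips that are
    `efficiency_factor` times as energy-efficient."""
    return current_power_mw / efficiency_factor

current = 20.0           # hypothetical 20 MW facility (illustrative)
low, high = 50, 100      # efficiency range quoted in the article

print(projected_power_mw(current, low))   # 0.4 MW at 50x
print(projected_power_mw(current, high))  # 0.2 MW at 100x
```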
The Cray Urika-XA System is an open platform for high-performance big data analytics, pre-integrated with the Apache Hadoop and Apache Spark frameworks. It is designed to provide users with the benefits of a turnkey analytics appliance combined with a flexible, open platform that can be modified for future analytics workloads.
NVIDIA is looking for a dozen would-be competitors for next year’s Early Stage Challenge, which takes place as part of its Emerging Companies Summit (ECS). In this seventh annual contest, hot young startups using GPUs vie for a single $100,000 grand prize.
In a keynote speech at IBM Enterprise, Jamie Thomas, General Manager, Storage and Software Defined Systems at IBM, unveiled a bold strategy for the company’s storage business. Building upon the Software Defined Storage portfolio announced last May, IBM is focusing its storage business on a new model for enterprise data storage that is optimized for interoperability across hardware and software solutions.
Two research teams have found distinct solutions to a critical challenge that has held back the realization of super powerful quantum computers. The teams, working in the same laboratories at UNSW Australia, created two types of quantum bits, or "qubits" — the building blocks for quantum computers — that each process quantum data with an accuracy above 99 percent.
Next-gen leaders push themselves every day to answer this key question: How can my organization make a difference? IBM is helping to deliver the answer with new apps powered by Watson to improve the quality of life. IBM's Watson is a groundbreaking platform with the ability to interact in natural language, process vast amounts of disparate forms of big data and learn from each interaction.
High Performance Parallelism Pearls, the latest book by James Reinders and Jim Jeffers, is a teaching juggernaut that packs the experience of 69 authors into 28 chapters designed to get readers running on the Intel Xeon Phi family of coprocessors, plus provide tools and techniques to adapt legacy codes, as well as increase application performance on Intel Xeon processors.
More than 2.8 megaliters of water have been saved in just under a year by using groundwater to cool the Pawsey Centre supercomputer in Perth. To make that happen, scientists undertook stringent tests to ensure that returning heated water to the Mullalloo aquifer has no adverse effects.
Supercomputing 2014 (SC14) is announcing a new “HPC Matters” plenary that will examine the important roles that high performance computing (HPC) plays in every aspect of society from simplifying manufacturing to tsunami warning systems and hurricane predictions to improving care for cancer patients.
The next time some nasty storms are heading your way, the National Weather Service says it will have a better forecast of just how close they could come to you. The weather service started using a new high-resolution computer model that officials say will dramatically improve forecasts for storms up to 15 hours in advance. It should better pinpoint where and when tornadoes, thunderstorms and blizzards are expected.
IBM Watson Group's global headquarters, at 51 Astor Place in New York City's Silicon Alley, is open for business. The Watson headquarters will serve as a home base for more than 600 IBM Watson employees, just part of the more than 2,000 IBMers dedicated to Watson worldwide. In addition to a sizeable employee presence, IBM is opening its doors to area developers and entrepreneurs, hosting industry workshops, seminars and networking events.
The IEEE Technology Time Machine (TTM) is going further into the future. Now in its third year, the annual two-day IEEE meeting is mixing things up a little in terms of format and topics. Rather than just looking at how some technologies might evolve in the next decade, experts and visionaries are going to look out to 2035 and beyond.
Building on client demand to integrate real-time analytics with consumer transactions, IBM has announced new capabilities for its System z mainframe. The integration of analytics with transactional data can provide businesses with real-time, actionable insights on commercial transactions as they occur to take advantage of new opportunities to increase sales and help minimize loss through fraud prevention.
Hewlett-Packard is splitting itself into two companies, one focused on its personal computer and printing business and another on technology services, such as data storage, servers and software, as it aims to drive profits higher. Hewlett-Packard, like other PC makers, has been facing changing consumer tastes — moving away from desktops and laptops and toward smartphones and tablets.
Intel and Switch SUPERNAP have partnered to bring Cherry Creek, one of the world’s most powerful supercomputers, liquid-cooled by CoolIT Systems, to the University of Nevada, Las Vegas (UNLV).
The Cray XC40 supercomputer and CS 400 cluster supercomputer feature the new Intel Xeon processor E5-2600 v3 product family, formerly code named “Haswell.” The supercomputer is available with the new DataWarp technology, an applications I/O accelerator that addresses the growing performance gap between compute resources and disk-based storage.
The ISC High Performance conference, formerly known as the International Supercomputing Conference, is now open to various submission opportunities. Whether your interest lies in workshops, tutorials, birds-of-a-feather (BoF) sessions, research papers, research posters or in the student volunteer program, ISC is welcoming proposals from all members of the high performance computing (HPC) community.
Cray Inc. has announced that the Department of Defense (DoD) High Performance Computing Modernization Program (HPCMP) has awarded Cray a $26 million supercomputer contract for a next-generation Cray XC supercomputer and a Cray Sonexion storage system.
Mellanox Technologies, a supplier of interconnect solutions for servers and storage systems, has announced that its end-to-end 10 and 40 Gigabit Ethernet interconnect solutions support the recently announced IBTA RoCEv2 specification. The RoCEv2 standard enables routing RDMA traffic across Layer 3 Ethernet networks to address the needs of today’s evolving hyperscale Web 2.0 and cloud deployments.
Computationally intensive research in Sweden will soon get a boost from the fastest academic supercomputer in the Nordic countries: a Cray XC30 with 1,676 nodes and 104.7 terabytes of memory, to be installed in October 2014 at KTH Royal Institute of Technology’s PDC Center for High Performance Computing.
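From the two figures given for the KTH system, the average memory per node works out to roughly 64 GB (a sketch assuming 1 TB = 1,024 GB; both input figures come from the article):

```python
# Average memory per node for the Cray XC30 described above
# (1,676 nodes, 104.7 TB aggregate memory, per the article).

nodes = 1676
total_memory_tb = 104.7

per_node_gb = total_memory_tb * 1024 / nodes  # assuming 1 TB = 1,024 GB
print(round(per_node_gb, 1))  # ~64 GB per node
```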
IBM announced that Caris Life Sciences is using IBM technical computing and storage technology to accelerate the company’s molecular profiling services for cancer patients. The Caris tumor profiling database is one of the largest datasets in the application of advanced molecular profiling technologies to support clinicians in delivering personalized treatment recommendations — or precision oncology.
With President Obama announcing climate-support initiatives at the 2014 United Nations Climate Summit, the U.S. Department of Energy national laboratories are teaming with academia and the private sector to develop the most advanced climate and Earth system computer model yet created. For Los Alamos National Laboratory researchers, it is a welcome advance for an already vibrant high-performance computing community.
A computer model that accurately predicts how composite materials behave when damaged will make it easier to design lighter, more fuel-efficient aircraft. Innovative computer codes form the basis of a computer model that shows in unprecedented detail how an aircraft's composite wing, for instance, would behave if it suffered small-scale damage, such as a bird strike.
Governor Jindal and LSU President and Chancellor Alexander announced creation of the LSU Transformational Technology and Cyber Research Center, which will pursue major federal and commercial research projects in applied technology fields, leveraging the university’s unique strengths in such disciplines as supercomputing, cybersecurity and nanotechnology.
For centuries, scientific research has revolved around data, and as research data continues to grow exponentially, so does the importance of how it is stored. A key example of how the scientific field can tackle Big Data storage is DESY, a research organization dedicated to giving scientists worldwide faster access to insights from their samples, a mission that makes optimal data management in a high-volume environment critical.