The drug development process often misses side effects that kill at least 100,000 patients a year. LLNL researchers have developed a method that uses high-performance computers to identify the proteins responsible for specific adverse drug reactions, processing proteins and drug compounds through an algorithm that produces reliable data for drug discovery outside of a laboratory setting.
What do the DNA in Australian seaweed, Amazon River water, tropical plants, and forest soil all...
The purpose of this series is to discuss the impact of GMP (Good Manufacturing Practice)...
The HPC Advisory Council and ISC High Performance call on undergraduate students from around the...
IBM will pay GlobalFoundries $1.5 billion to take over its costly chip division. IBM Director of Research John E. Kelly III said in an interview on October 20, 2014, that handing over control of the semiconductor operations will allow IBM to grow faster, while the company continues to invest in and expand its chip research.
The DOE’s Energy Sciences Network, or ESnet, is deploying four new high-speed transatlantic links, giving researchers at America’s national laboratories and universities ultra-fast access to scientific data from the Large Hadron Collider and other research sites in Europe. ESnet’s transatlantic extension will deliver a total capacity of 340 gigabits per second and serve dozens of scientific collaborations.
When ESnet engineers began to study the idea of building a new 100 Gbps network between the US and Europe, a primary concern was ensuring the service would be robust and built from multiple underlying links — so that if one went down, researchers could still rely on sufficient bandwidth. Based on data collected by Caltech physicist and networking pioneer Harvey Newman, the team understood multiple cables are sometimes cut simultaneously.
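To make that robustness concrete, here is a minimal Python sketch of the single-failure arithmetic the engineers were reasoning about. The per-link capacities below are an assumption chosen to sum to the 340 Gbps total cited above, and the 100 Gbps demand floor is hypothetical.

```python
# Sketch: surviving transatlantic capacity after one underlying link is cut.
# Per-link capacities are an assumption summing to the cited 340 Gbps total;
# the demand floor is a hypothetical figure for illustration.

LINK_CAPACITIES_GBPS = [100, 100, 100, 40]
REQUIRED_FLOOR_GBPS = 100  # hypothetical minimum science-traffic demand


def surviving_capacity(capacities, failed_index):
    """Total bandwidth left after one underlying link fails."""
    return sum(c for i, c in enumerate(capacities) if i != failed_index)


# The worst single failure is losing the largest link.
worst_case = min(surviving_capacity(LINK_CAPACITIES_GBPS, i)
                 for i in range(len(LINK_CAPACITIES_GBPS)))
print(f"Worst-case surviving capacity: {worst_case} Gbps")
print(f"Meets {REQUIRED_FLOOR_GBPS} Gbps floor: {worst_case >= REQUIRED_FLOOR_GBPS}")
```

With this split, even losing a 100 Gbps link leaves 240 Gbps, which is why building the service from multiple underlying links mattered more than any single link's speed.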
The Jefferson Project announced new milestones in a multimillion-dollar collaboration that seeks to understand and manage complex factors impacting Lake George. A new data visualization laboratory features advanced computing and graphics systems that allow researchers to visualize sophisticated models and incoming data on weather, runoff and circulation patterns. The lab will display streaming data from various sensors in real-time.
Computer chips with superconducting circuits — circuits with zero electrical resistance — would be 50 to 100 times as energy-efficient as today’s chips, an attractive trait given the increasing power consumption of the massive data centers that power the Internet’s most popular sites. Superconducting chips also promise greater processing power.
SGI UV for SAP HANA is a purpose-built, in-memory computing appliance for growing environments running the SAP HANA platform. SAP-certified and available as a 4- or 8-socket single-node system with up to 6 TB of memory, the appliance is designed to enable the largest enterprises to achieve real-time operations and business breakthroughs with SAP HANA at extreme scale.
The Cray Urika-XA System is an open platform for high-performance big data analytics, pre-integrated with the Apache Hadoop and Apache Spark frameworks. It is designed to provide users with the benefits of a turnkey analytics appliance combined with a flexible, open platform that can be modified for future analytics workloads.
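As an illustration of the kind of workload such an appliance targets, here is a minimal PySpark sketch of a simple aggregation job. Nothing in it is specific to the Urika-XA, and the HDFS input path is a placeholder.

```python
# A generic Spark aggregation job of the sort a pre-integrated analytics
# platform is meant to run out of the box. Nothing here is Cray-specific;
# the HDFS path is a placeholder.
from pyspark import SparkContext

sc = SparkContext(appName="event-summary")

# Read line-oriented event logs from HDFS (placeholder path).
lines = sc.textFile("hdfs:///data/events.log")

# Count events per leading keyword and report the ten most frequent.
counts = (lines.map(lambda line: line.split(" ")[0])
               .map(lambda key: (key, 1))
               .reduceByKey(lambda a, b: a + b))

for key, n in counts.takeOrdered(10, key=lambda kv: -kv[1]):
    print(key, n)

sc.stop()
```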
NVIDIA is looking for a dozen would-be competitors for next year’s Early Stage Challenge, which takes place as part of its Emerging Companies Summit (ECS). In this seventh annual contest, hot young startups using GPUs vie for a single $100,000 grand prize.
The SX1400, SX1700 and SX1710 are top-of-rack Open Ethernet software-defined networking (SDN) switches supporting 10/40/56 Gigabit Ethernet. Based on Mellanox’s SwitchX-2 switch ICs, they provide enhanced control plane capabilities and allow for the design of hyperscale data center networks and control-intensive cloud applications.
In a keynote speech at IBM Enterprise, Jamie Thomas, General Manager, Storage and Software Defined Systems at IBM, unveiled a bold strategy for the company’s storage business. Building upon the Software Defined Storage portfolio announced last May, IBM is focusing its storage business on a new model for enterprise data storage that is optimized for interoperability across hardware and software solutions.
Two research teams have found distinct solutions to a critical challenge that has held back the realization of super powerful quantum computers. The teams, working in the same laboratories at UNSW Australia, created two types of quantum bits, or "qubits" — the building blocks for quantum computers — that each process quantum data with an accuracy above 99 percent.
From performing surgery to driving cars, today’s robots can do it all. With chatbots recently hailed as passing the Turing test, it appears robots are becoming increasingly adept at posing as humans. As machines become ever more integrated into human lives, the need to imbue them with a sense of morality grows increasingly urgent. But can we really teach robots how to be good? An innovative piece of research looks into the matter.
The Oil and Gas High Performance Computing (HPC) Workshop, hosted annually at Rice University, is the premier meeting place for discussion of challenges and opportunities around high performance computing, information technology, and computational science and engineering.
Next-gen leaders push themselves every day to answer this key question: How can my organization make a difference? IBM is helping to deliver the answer with new apps powered by Watson to improve the quality of life. IBM's Watson is a groundbreaking platform with the ability to interact in natural language, process vast amounts of disparate forms of big data and learn from each interaction.
High Performance Parallelism Pearls, the latest book by James Reinders and Jim Jeffers, is a teaching juggernaut that packs the experience of 69 authors into 28 chapters designed to get readers running on the Intel Xeon Phi family of coprocessors. It also provides tools and techniques for adapting legacy codes and increasing application performance on Intel Xeon processors.
As with the proverbial canary in the coal mine, birds serve as an indicator of the health of our environment. Many common species have experienced significant population declines within the last 40 years. Suggested causes include habitat loss and climate change; however, fully understanding bird distribution relative to the environment requires extensive data.
More than 2.8 megaliters of water has been saved in just under a year by using groundwater to cool the Pawsey Centre supercomputer in Perth. To make that happen, scientists have undertaken stringent tests to ensure that returning heated water to the Mullaloo aquifer has no adverse effects.
NASA has formally delivered to Alaskan officials a new technology that could help pilots flying over the vast wilderness expanses of the northernmost state. The technology is designed to help pilots make better flight decisions, especially when disconnected from the Internet, telephone, flight services and other data sources normally used by pilots.
Technology is putting complex topics like severe weather and climate change on the map — literally. Mapping data associated with specific geographic locations is a powerful way to glean new and improved knowledge from data collections and to explain the results to policymakers and the public. Particularly useful is the ability to layer different kinds of geospatial data on top of one another and see how they interact.
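A minimal sketch of that layering idea, using synthetic values in place of real weather or climate layers: a gridded field is drawn as a base layer and point observations are drawn on top, so the two datasets can be compared visually. Real work would use projected GIS layers, but the principle is the same.

```python
# Layering two kinds of geospatial data; all values here are synthetic.
import numpy as np
import matplotlib.pyplot as plt

# Layer 1: a gridded field, e.g. a modeled temperature anomaly.
lon = np.linspace(-110, -90, 100)
lat = np.linspace(30, 45, 80)
field = np.sin(np.radians(lat))[:, None] * np.cos(np.radians(lon))[None, :]

fig, ax = plt.subplots()
im = ax.pcolormesh(lon, lat, field, shading="auto", cmap="coolwarm")

# Layer 2: point observations (random stand-in "station" reports) drawn
# on top, so the eye can relate the two datasets directly.
rng = np.random.default_rng(0)
ax.scatter(rng.uniform(-110, -90, 25), rng.uniform(30, 45, 25),
           c="black", s=12, label="stations")

fig.colorbar(im, ax=ax, label="anomaly (arbitrary units)")
ax.set_xlabel("longitude")
ax.set_ylabel("latitude")
ax.legend()
plt.show()
```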
Supercomputing 2014 (SC14) is announcing a new “HPC Matters” plenary that will examine the important roles that high performance computing (HPC) plays in every aspect of society, from simplifying manufacturing to tsunami warning systems and hurricane predictions to improving care for cancer patients.
The next time some nasty storms are heading your way, the National Weather Service says it will have a better forecast of just how close they could come to you. The weather service started using a new high-resolution computer model that officials say will dramatically improve forecasts for storms up to 15 hours in advance. It should better pinpoint where and when tornadoes, thunderstorms and blizzards are expected.
A little-known secret in data mining is that simply feeding raw data into a data analysis algorithm is unlikely to produce meaningful results. New discoveries often begin with comparison of data streams to find connections and spot outliers. But most data comparison algorithms today share one major weakness: somewhere along the line, they rely on a human expert. And experts aren’t keeping pace with the complexities of big data.
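A toy example of both moves, with synthetic streams standing in for real data: correlation surfaces a connection between two streams, and a z-score threshold flags outliers. The threshold itself is exactly the kind of expert judgment the article says doesn't scale.

```python
# Comparing two data streams: a connection (correlation) and outliers
# (z-scores). The streams are synthetic stand-ins for real data.
import numpy as np

rng = np.random.default_rng(1)
stream_a = rng.normal(0, 1, 1000)
stream_b = 0.8 * stream_a + rng.normal(0, 0.5, 1000)  # related stream
stream_b[500] += 8.0                                   # planted anomaly

# Connection: how strongly do the two streams move together?
corr = np.corrcoef(stream_a, stream_b)[0, 1]
print(f"correlation: {corr:.2f}")

# Outliers: points whose z-score exceeds a threshold an analyst picks.
z = (stream_b - stream_b.mean()) / stream_b.std()
print("outlier indices:", np.flatnonzero(np.abs(z) > 4))
```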
IBM Watson Group's global headquarters, at 51 Astor Place in New York City's Silicon Alley, is open for business. The Watson headquarters will serve as a home base for more than 600 IBM Watson employees, just part of the more than 2,000 IBMers dedicated to Watson worldwide. In addition to a sizeable employee presence, IBM is opening its doors to area developers and entrepreneurs, hosting industry workshops, seminars and networking events.
The IEEE Technology Time Machine (TTM) is going further into the future. Now in its third year, the annual two-day IEEE meeting is mixing things up a little in terms of format and topics. Rather than just looking at how some technologies might evolve in the next decade, experts and visionaries are going to look out to 2035 and beyond.
Building on client demand to integrate real-time analytics with consumer transactions, IBM has announced new capabilities for its System z mainframe. Integrating analytics with transactional data can give businesses real-time, actionable insights on commercial transactions as they occur, helping them take advantage of new opportunities to increase sales and minimize losses through fraud prevention.