Industry News

The Lead

Scientifically accurate 3-D heart model accelerates device testing and research for treatment of heart disease

Dassault Systèmes Announces Commercial Availability of Its First Simulated Human Heart

May 20, 2015 1:58 pm | by Dassault Systèmes | News | Comments

Dassault Systèmes announced that the first heart model from its “Living Heart Project” will be commercially available on May 29, 2015. Powered by the realistic simulation applications of Dassault Systèmes’ 3DEXPERIENCE platform, this commercial, high-fidelity, scientifically validated 3-D simulator of a four-chamber human heart is the first product of its kind.

Last Chance to Submit ISC Cloud & Big Data Research Papers

May 13, 2015 12:05 pm | by Suzanne Tracy, Editor-in-Chief, Scientific Computing and HPC Source | News | Comments

The ISC Cloud & Big Data Research Committee is accepting submissions until Tuesday, May 19,...

ISC Announces the Hans Meuer Award Winning Research Paper

May 11, 2015 8:35 am | by ISC | News | Comments

ISC has announced that a research paper in the area of in-memory architecture, jointly submitted...

Nominations for Three SC15 Awards due July 1

May 8, 2015 9:52 am | by SC15 | News | Comments

Each year, the global supercomputing community honors a handful of the leading contributors to...


Managing the World's Largest Trove of Climate Data

May 8, 2015 9:10 am | by IBM | News | Comments

The German Climate Computing Center is managing the world's largest climate simulation data archive, used by climate researchers worldwide. The archive consists of more than 40 petabytes of data and is projected to grow by roughly 75 petabytes annually over the next five years. As climate simulations are carried out on increasingly powerful supercomputers, massive amounts of data are produced that must be effectively stored and analyzed.
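The growth figures quoted above imply a simple back-of-envelope projection. This sketch uses only the numbers in this article (40 petabytes today, roughly 75 petabytes of growth per year), not official DKRZ planning data:

```python
# Back-of-envelope projection of the DKRZ archive size, using only the
# figures quoted in the article: 40 PB today, ~75 PB of growth per year.
current_pb = 40          # current archive size, in petabytes
growth_pb_per_year = 75  # projected annual growth, in petabytes

for year in range(1, 6):
    projected = current_pb + growth_pb_per_year * year
    print(f"Year {year}: ~{projected} PB")
```

By the fifth year the archive would hold roughly 415 petabytes, a tenfold increase over today's holdings.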

Technology-related announcements included development of a brand new x86 processor core codenamed “Zen,” expected to drive AMD’s re-entry into high-performance desktop and server markets through improved instructions per clock of up to 40 percent.

AMD Announces “Zen” x86 Processor Core

May 7, 2015 12:11 pm | by AMD | News | Comments

AMD provided details of the company’s multi-year strategy to drive profitable growth based on delivering next-generation technologies powering a broad set of high-performance, differentiated products. Technology-related announcements included development of a brand new x86 processor core, codenamed “Zen,” that will feature simultaneous multi-threading (SMT) for higher throughput and a new cache subsystem.

A main goal of the Young Mind Awards program is to stimulate interest in engineering at an earlier age, and bring a greater awareness that a career in this field can be rewarding while solving real-world problems.

2015 Young Mind Awards Program Accepting Entries through May 31

May 7, 2015 8:15 am | by Suzanne Tracy, Editor-in-Chief, Scientific Computing and HPC Source | News | Comments

The Young Mind Awards, a global student competition that showcases design engineering and R&D excellence, is now accepting entries in five award categories. Designed to “challenge and inspire promising future engineers,” the program is open to middle and high school students, as well as undergraduates.

EMC has announced new enhancements across its Documentum portfolio of enterprise content management (ECM) applications.

EMC Enhances Documentum Enterprise Content Management Applications

May 4, 2015 12:42 pm | by EMC | News | Comments

As the emergence of social media, cloud and big data continues to fuel the digital evolution, today’s digital workplace must drive new levels of employee engagement, operational efficiency and service excellence. To help deliver on this digital transformation, EMC has announced new enhancements across its Documentum portfolio of ECM applications, enabling users to further address next-generation ECM.

Dr. Jan Vitt, Head of IT Infrastructure, DZ Bank

ISC Cloud & Big Data Keynote Will Recount Bank’s Path into Cloud Computing

April 30, 2015 4:17 pm | by ISC | News | Comments

As the fourth-largest cooperative bank in Germany, DZ Bank supports the business activities of over 900 other cooperative banks in the country. Dr. Jan Vitt, the Head of IT Infrastructure at DZ Bank, will talk about how a conservative institution like his is effectively adopting cloud computing to address the IT needs of its various business divisions.

IBM researcher Jerry Chow in the quantum computing lab at IBM's T.J. Watson Research Center. Courtesy of Jon Simon/Feature Photo Service for IBM

Scientists Achieve Critical Steps to Building First Practical Quantum Computer

April 29, 2015 9:21 am | by IBM | News | Comments

IBM scientists unveiled two critical advances toward the realization of a practical quantum computer. For the first time, they showed the ability to detect and measure both kinds of quantum errors simultaneously, and demonstrated a new, square quantum bit circuit design that is the only physical architecture that could successfully scale to larger dimensions.

Water, CO2 and green power: these are the ingredients for Audi e-diesel.

Audi Succeeds in Making Diesel Fuel from Carbon Dioxide and Water

April 28, 2015 10:03 am | by Audi | News | Comments

Audi has taken another big step in the development of new, CO2-neutral fuels: a pilot plant in Dresden has started production of the synthetic fuel Audi e-diesel. After a commissioning phase of just four months, the research facility started producing its first batches of high-quality diesel fuel this month. The only raw materials needed are water and carbon dioxide.

The option of doing predictive analytics via the cloud gives security teams the flexibility to bring in skills, innovation and information on demand across all of their security environments.

Bringing Cyber Threat Predictive Analytics to The Cloud

April 27, 2015 9:51 am | by IBM | News | Comments

IBM is bringing its Security Intelligence technology, IBM QRadar, to the cloud, giving companies the ability to quickly prioritize real threats and free up critical resources to fight cyberattacks. The new services are available through a cloud-based software as a service (SaaS) model, with optional IBM Security Managed Services to provide deeper expertise and flexibility for security professionals.

Enterprise AI deployments will also drive additional spending on IT hardware and services including computing power, graphics processor units (GPUs), networking products, storage and cloud computing.

AI for Enterprise Applications to Reach $11.1 Billion, Deep Learning will be Breakout Technology

April 24, 2015 2:39 pm | by Tractica | News | Comments

After 60 years of false starts, the integration of artificial intelligence with probability and statistics has led to a marriage of machine learning, control theory and neuroscience that is yielding practical benefits. This shared theoretical foundation, combined with the exponential growth of processing power and the unprecedented increase in the amount of data available to analyze, has made AI systems attractive for businesses to adopt.

The Cori Phase 1 system will be the first supercomputer installed in the new Computational Research and Theory Facility now in the final stages of construction at Lawrence Berkeley National Laboratory.

Cray XC40 will be First Supercomputer in Berkeley Lab’s New Computational Research and Theory Facility

April 23, 2015 3:17 pm | by NERSC and Berkeley Lab | News | Comments

The U.S. Department of Energy’s (DOE) National Energy Research Scientific Computing (NERSC) Center and Cray announced they have finalized a new contract for a Cray XC40 supercomputer that will be the first NERSC system installed in the newly built Computational Research and Theory Facility at Lawrence Berkeley National Laboratory.

The winners of the CyberCenturion National Finals Competition, King Edward VI Grammar School, Chelmsford, with their coach, pictured in front of Colossus at The National Museum of Computing, Bletchley Park. Their awards were presented April 17.

UK CyberCenturion Competition Launches in Search for Young Cyber Security Talent

April 22, 2015 2:43 pm | by Northrop Grumman | News | Comments

Northrop Grumman has renewed its commitment to run the CyberCenturion competition for a second year, continuing its efforts to seek out the UK's best young cyber talent. CyberCenturion is the UK's first team-based cyber security contest specifically designed to attract 12- to 18-year-olds. The competition aims to engage young people with an interest in cyber as a way to address the STEM skills gap and encourage careers in cyber security.

The new Cray XC40 supercomputer and Sonexion storage system, Powered by Seagate, will provide PGS with the advanced computational capabilities necessary to run highly complex seismic processing and imaging applications.

Seagate Storage Technology Powering Four Cray Advanced HPC Implementations

April 21, 2015 11:49 am | by Seagate | News | Comments

Seagate Technology has announced that four Cray customers will be among the first to implement Seagate’s latest high performance computing storage technology. Combined, the implementations of these four customers in the government, weather, oil and gas, and university sectors will consume more than 120 petabytes of storage capacity.

INCITE seeks research proposals for capability computing: production simulations, including ensembles, that use a large fraction of Leadership Computing Facility systems or require the unique LCF architectural infrastructure.

INCITE Seeking Proposals to Advance Science and Engineering at U.S. Leadership Computing Facilities

April 20, 2015 10:07 am | by Jeff Gary, OLCF | News | Comments

The Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program is now accepting proposals for high-impact, computationally intensive research campaigns in a broad array of science, engineering and computer science domains.

Advertisement
The Challenge is designed to promote integration of transversal skills useful for the development of processing algorithms specifically optimized to maximize the processing capabilities of the latest graphics boards.

GPU4EO Challenge 2015: Stimulating Adoption of GPUs in Remote Sensing

April 17, 2015 3:11 pm | by Suzanne Tracy, Editor-in-Chief, Scientific Computing and HPC Source | News | Comments

GPU4EO Challenge 2015 is an international initiative that engages students, researchers and professionals in a challenge aimed at improving the performance of remotely sensed data processing using the capacity of GPUs. Teams are asked to process Earth observation satellite data with an NVIDIA K40 GPU and DORIS, an open-source software package, to obtain the best performance, as determined by the fastest processing time.

ISC High Performance has extended the deadline to apply for the ISC Student Volunteer Program. ISC will provide out-of-town students with accommodation, as well as most meals.

ISC High Performance Issues Urgent Call for Student Volunteers

April 16, 2015 8:46 am | by Suzanne Tracy, Editor-in-Chief, Scientific Computing and HPC Source | News | Comments

ISC High Performance has extended the deadline to apply for the ISC Student Volunteer Program. The new deadline is April 30, 2015. More volunteers are needed this year, as the conference will be hosting a larger number of sessions than in previous years, and the student volunteer program is critical in helping to run the conference as smoothly as possible.

The tape path used for data read-back in the world-record tape demo. On the right, a tape head writes the data; the tape moves to the left; and on the left, a dimple marks where the HDD head reads back the data just written.

IBM Research Sets New Record for Tape Storage

April 10, 2015 9:50 am | by IBM | News | Comments

IBM scientists have demonstrated an areal recording density of 123 billion bits of uncompressed data per square inch on low-cost, particulate magnetic tape, a breakthrough that represents the equivalent of a 220-terabyte tape cartridge that could fit in the palm of your hand.
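As a rough sanity check of the 220-terabyte figure, one can ask how much tape that areal density implies. The assumptions below (standard half-inch tape, full width usable, decimal terabytes) are ours for illustration, not IBM's published demo parameters:

```python
# How many meters of half-inch tape would 220 TB need at 123 Gbit/in^2?
# Assumptions (ours, not IBM's): full tape width usable, decimal TB.
areal_density = 123e9        # bits per square inch (from the article)
capacity_bits = 220e12 * 8   # 220 TB, decimal, expressed in bits
tape_width_in = 0.5          # standard half-inch tape

area_sq_in = capacity_bits / areal_density          # square inches of tape
length_m = (area_sq_in / tape_width_in) * 0.0254    # inches -> meters
print(f"Tape length needed: ~{length_m:.0f} m")
```

The result, roughly 730 meters, sits comfortably within the approximately 1,000 meters of tape a modern cartridge can hold, so the 220 TB claim is plausible at the stated density.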

Argonne’s decision to utilize Intel’s HPC scalable system framework stems from the fact that it is designed to deliver a well-balanced and adaptable system capable of supporting both compute-intensive and data-intensive workloads.

Intel to Deliver Nation’s Most Powerful Supercomputer at Argonne

April 9, 2015 2:07 pm | by Intel | News | Comments

Intel has announced that the U.S. Department of Energy’s (DOE) Argonne Leadership Computing Facility (ALCF) has awarded Intel Federal LLC, a wholly-owned subsidiary of Intel Corporation, a contract to deliver two next-generation supercomputers to Argonne National Laboratory.

The future of tropical rainforests in the Amazon and worldwide is the focus of a new research project that combines field experiments and predictive modeling.

Study Combines Field Experiments, Predictive Modeling to Look at How Forests Worldwide Respond to Climate Change

April 7, 2015 5:09 pm | by Oak Ridge National Laboratory | News | Comments

Researchers from the Department of Energy’s Oak Ridge National Laboratory will play key roles in an expansive new project that aims to bring the future of tropical forests and the climate system into much clearer focus by coupling field research with the development of a new ecosystem model.

During the LHC's second run, particles will collide at a staggering 13 teraelectronvolts (TeV), which is 60 percent higher than any accelerator has achieved before.

U.S. Scientists Celebrate Restart of Large Hadron Collider

April 6, 2015 3:46 pm | by Oak Ridge National Laboratory | News | Comments

The world's most powerful particle accelerator began its second act on April 5. After two years of upgrades and repairs, proton beams once again circulated around the Large Hadron Collider, located at the CERN laboratory near Geneva. With the collider back in action, more than 1,700 U.S. scientists are prepared to join thousands of their international colleagues to study the highest-energy particle collisions ever achieved.
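The "60 percent higher" figure quoted in the caption can be checked against the LHC's own previous record. A minimal verification, assuming the prior benchmark is the LHC's 8 TeV Run 1 collision energy:

```python
# Verify the "60 percent higher" claim, assuming the previous record
# was the LHC's own Run 1 collision energy of 8 TeV.
run1_tev = 8.0    # Run 1 collision energy
run2_tev = 13.0   # Run 2 collision energy (from the article)

increase_pct = (run2_tev - run1_tev) / run1_tev * 100
print(f"Increase: {increase_pct:.1f}%")
```

The exact increase is 62.5 percent, consistent with the article's rounded "60 percent higher" claim.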

The current Pangea supercomputer is a 2.3-petaflop system based on the Intel Xeon E5-2670 v3 processor that consists of 110,592 cores and contains 442 terabytes of memory, built on SGI ICE X, one of the world's fastest commercial distributed-memory supercomputers.

Total Partners with SGI to Upgrade its Pangea Supercomputer

April 1, 2015 11:27 am | by SGI | News | Comments

Total has chosen SGI to upgrade its supercomputer Pangea. Total is one of the largest integrated oil and gas companies in the world, with activities in more than 130 countries. Its 100,000 employees put their expertise to work in every part of the industry — the exploration and production of oil and natural gas, refining, chemicals, marketing and new energies. This updated system would place in the top 10 of the latest TOP500 list.

In New York City, Manju Malkani, IBM analytics consultant, and Paul Walsh, Vice President of Weather Analytics at The Weather Company, access real-time weather data through IBM Watson Analytics.

The Weather Company Migrates Data Services to IBM Cloud, Plans to Advance Internet of Things Solutions

March 31, 2015 1:43 pm | by IBM | News | Comments

IBM and The Weather Company have announced a global strategic alliance to integrate real-time weather insights into business to improve operational performance and decision-making. As part of the alliance, The Weather Company, including its global B2B division WSI, will shift its massive weather data services platform to the IBM Cloud and integrate its data with IBM analytics and cloud services.

The 2015 Ethernet Roadmap provides practical guidance on the development of Ethernet, and offers an in-depth look at Ethernet’s accelerating evolution and expansion in four key areas, including consumer and residential; enterprise and campus; and hyperscale data centers.

Ethernet Alliance Unveils Detailed Roadmap

March 25, 2015 1:57 pm | by Ethernet Alliance | News | Comments

The Ethernet Alliance, a global consortium dedicated to the continued success and advancement of Ethernet technologies, has released the 2015 Ethernet Roadmap. The first-ever publicly available industry roadmap outlines the ongoing development and evolution of Ethernet through the end of the decade. Ethernet, the world’s most widely adopted networking technology, saw a period of rapid change and diversification in 2014.

Michael Stonebraker invented many of the concepts that are used in almost all modern database systems. Courtesy of Dcoetzee

“Nobel Prize in Computing” goes to MIT Database Systems Architecture Pioneer

March 25, 2015 1:44 pm | by Association for Computing Machinery | News | Comments

The Association for Computing Machinery has named Michael Stonebraker of MIT recipient of the 2014 ACM A.M. Turing Award for fundamental contributions to the concepts and practices underlying modern database systems. Database systems are critical applications of computing and preserve much of the world's important data. Stonebraker invented many of the concepts that are used in almost all modern database systems.

The OpenPOWER Foundation, a collaboration of technologists encouraging the adoption of an open server architecture for computer data centers, has grown to more than 110 businesses, organizations and individuals across 22 countries.

10 New OpenPOWER Foundation Solutions Unveiled

March 19, 2015 3:19 pm | by OpenPOWER Foundation | News | Comments

The OpenPOWER Foundation has announced more than 10 hardware solutions — spanning systems, boards and cards, and a new microprocessor customized for China. Built collaboratively by OpenPOWER members, the new solutions exploit the POWER architecture to provide more choice, customization and performance to customers, including hyperscale data centers. 

Pascal will offer better performance than Maxwell on key deep-learning tasks.

NVIDIA’s Next-Gen Pascal GPU Architecture to Provide 10X Speedup for Deep Learning Apps

March 18, 2015 12:24 pm | News | Comments

NVIDIA has announced that its Pascal GPU architecture, set to debut next year, will accelerate deep learning applications 10X beyond the speed of its current-generation Maxwell processors. NVIDIA CEO and co-founder Jen-Hsun Huang revealed details of Pascal and the company’s updated processor roadmap in front of a crowd of 4,000 during his keynote address at the GPU Technology Conference, in Silicon Valley.
