In the latest issue of HPC Source, “A New Dawn: Bringing HPC to the Enterprise,” we look at how small- to medium-sized manufacturers can realize major benefits from adoption of high performance computing in areas such as modeling, simulation and analysis.
Silicon Mechanics has announced that it is launching the 4th Annual Research Cluster...
The Huawei FusionServer X6800 is a next-generation data center server optimized to support...
Iceotope's PetaGen liquid cooling system is designed for high performance computing (HPC) and is suited to supercomputing and data center facilities. The system uses minimal energy, has no dependence on clean water, reduces the footprint and cost of data center facilities, and improves IT performance.
High-performance computing can help a business to become more efficient and more productive. And, for a small business, HPC can be a game changer, helping it leapfrog ahead of the competition by reducing its costs and dramatically improving its time to market.
SGI has recently upgraded NASA's flagship supercomputer, Pleiades, with new, more powerful hardware in order to help meet the agency's increasing demand for high-end computing (HEC) resources.
An interview with PNNL’s Karol Kowalski, Capability Lead for NWChem Development - NWChem is an open source high performance computational chemistry tool developed for the Department of Energy at Pacific Northwest National Lab in Richland, WA. I recently visited with Karol Kowalski, Capability Lead for NWChem Development, who works in the Environmental Molecular Sciences Laboratory (EMSL) at PNNL.
International Data Corporation (IDC) has announced the eighth round of recipients of the HPC Innovation Excellence Award at the SC'14 high performance computing (HPC) conference in New Orleans, Louisiana. The HPC Innovation Excellence Awards recognize noteworthy achievements by users of high performance computing technologies.
HPC Innovation Excellence Award: Argonne National Laboratory, NRG (Netherlands), SCK-CEN (Belgium), TerraPower, and the University of Illinois at Urbana-Champaign (November 17, 2014) | Award Winners
Researchers from Argonne National Laboratory and the University of Illinois at Urbana-Champaign teamed with nuclear reactor designers and research laboratories in the United States and Europe to enable high-fidelity, cost-saving simulations to design the next-generation of nuclear reactors using the computational fluid dynamics code Nek5000.
The Center for Pediatric Genomic Medicine at Children's Mercy Hospitals Kansas City was the first genome center in the world to be created inside a children's hospital and one of the first to focus on genome sequencing and analysis for inherited childhood diseases.
For the US Army, and DoD and intelligence community as a whole, GIS Federal developed an innovative approach to quickly filter, analyze, and visualize big data from hundreds of data providers with a particular emphasis on geospatial data.
Researchers from NCSU conducted innovative research that will allow better prediction of thermal hydraulic behavior for current and future nuclear reactor designs.
Nexio Simulation is a French SME located in Toulouse that specializes in electromagnetic simulation software for applications in the marine, space, defense and aeronautics domains.
The noise generated by civil air transport adversely impacts population centers near major airports. With the expected growth in air travel, community exposure to aircraft noise will increase considerably. To alleviate this problem, the Environmentally Responsible Aviation (ERA) project within NASA's Aeronautics Mission Research Directorate is working to simultaneously reduce aircraft noise, fuel consumption, and engine emissions.
Researchers used HPC resources to run and visualize a breakthrough simulation of a long-track EF5 tornado embedded within a supercell.
PayPal engineers developed a platform for real‐time event analytics using HPC designs on new hardware technology.
HPC Innovation Excellence Award: The University of Texas MD Anderson Cancer Center, Texas Advanced Computing Center (TACC) and Elekta AB (November 17, 2014) | Award Winners
Researchers at MD Anderson Cancer Center in collaboration with TACC and Elekta AB are using detailed Monte Carlo computer simulations of radiation transport to assist in the development of the next generation of radiation therapy cancer treatments, which use a magnetic resonance imaging (MRI) scanner integrated with a radiation therapy unit (MRI-linac unit).
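The MD Anderson work relies on full production radiation transport codes, but the core Monte Carlo idea can be illustrated with a much simpler example. The sketch below (a toy model, not the team's actual code; the attenuation coefficient and slab thickness are illustrative values) samples exponentially distributed photon free paths to estimate the fraction of a beam that crosses a uniform slab without interacting, and compares the estimate to the analytic Beer-Lambert result:

```python
import math
import random

def transmitted_fraction(mu, thickness, n_photons=100_000, seed=1):
    """Toy Monte Carlo transport: estimate the fraction of photons that
    cross a uniform slab without interacting. Free path lengths are
    drawn from the exponential distribution with linear attenuation
    coefficient mu (per cm)."""
    rng = random.Random(seed)
    transmitted = 0
    for _ in range(n_photons):
        # Distance to first interaction: s = -ln(u) / mu
        s = -math.log(rng.random()) / mu
        if s > thickness:
            transmitted += 1
    return transmitted / n_photons

# Illustrative parameters (hypothetical, roughly water-like for MeV photons)
mu, x = 0.2, 5.0
mc = transmitted_fraction(mu, x)
exact = math.exp(-mu * x)   # analytic Beer-Lambert transmission
```

With 100,000 histories the Monte Carlo estimate agrees with the analytic value to within about a percent; production codes add scattering physics, geometry, and (for MRI-linac work) the deflection of charged secondaries by the magnetic field, but the sampling loop is the same in spirit.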
Researchers at The Ohio State University Comprehensive Cancer Center developed and implemented bioinformatics and molecular methods to understand what happens to human papillomavirus (HPV) DNA in the "end game" of HPV-positive human cancers.
To heat magnetically confined plasmas to the millions of degrees needed for fusion reactions, scientists inject megawatts of electromagnetic energy from carefully engineered radiofrequency antennas. The generated electromagnetic waves interact with the plasma in complex ways.
The Cray Sonexion 2000 system combines Cray's expertise in designing, scaling and managing end-to-end Lustre solutions with a unique architecture that allows for maximum scalability.
The U.S. Department of Energy has awarded IBM contracts valued at over $300 million to develop and deliver the world’s most advanced “data centric” supercomputing systems at Lawrence Livermore and Oak Ridge National Laboratories to advance innovation and discovery in science, engineering and national security.
It has been a commonly held belief that supercomputing capability is a predictable phenomenon with the "fastest" system in the world increasing in power by three orders of magnitude about every 11 years. I put the term "fastest" in quotes, because very few ask the question: Fastest in what way? It turns out that this notion of "fastest" is limited to a narrow consideration of system performance that focuses on floating point capability.
U.S. Secretary of Energy Ernest Moniz announced two new high performance computing (HPC) awards to put the nation on a fast-track to next generation exascale computing, which will help to advance U.S. leadership in scientific research and promote America’s economic and national security.
A decade of close scrutiny has shed much more light on the technical computing needs of small and medium enterprises (SMEs), but they are still shrouded in partial darkness. That’s hardly surprising for a diverse global group with millions of members ranging from automotive suppliers and shotgun genomics labs to corner newsstands and strip mall nail salons.
Welcome to SCIENTIFIC COMPUTING's "Bringing HPC to Smaller Manufacturers" edition of HPC Source, an interactive publication devoted exclusively to coverage of high performance computing.
Folk wisdom can sometimes be right on target. For example, there’s that old bromide about leading a horse to water. In this case, the water is high performance computing, and the reluctant equine is the huge base of small- to medium-sized manufacturers in the U.S. According to the National Center for Manufacturing Sciences, there are approximately 300,000 manufacturers in the U.S. Over 95 percent of them can be characterized as SMMs.
To help moderate the energy needs of increasingly power-hungry supercomputers, researchers at Sandia National Laboratories have released an application programming interface (API) with the goal of standardizing measurement and control of power- and energy-relevant features for HPC systems. The High Performance Computing — Power API specification is vendor-neutral and remains open to collaborators for future development.
In our November issue, Don Johnston looks at how “HPC Matters to our Quality of Life and Prosperity” and at how, through the HPC Impact Showcase, SC14 aims to underscore just how far-reaching high performance computing’s influence has become. Our cover story takes a look at several examples of how supercomputing capabilities are now being applied to problems that help businesses be more competitive and improve the quality of daily life.