
The Lead

The Hans Meuer Award has been created in memory of the late Dr. Hans Meuer, general chair of the ISC conference from 1986 through 2014, and co-founder of the TOP500 project.

ISC Introduces the Hans Meuer Award

February 27, 2015 2:53 pm | by Suzanne Tracy, Editor-in-Chief, Scientific Computing and HPC Source | News | Comments

ISC is introducing the Hans Meuer Award to honor the most outstanding research paper submitted to the ISC High Performance conference’s research paper committee. This annual award has been created in memory of the late Dr. Hans Meuer, general chair of the ISC conference from 1986 through 2014, and co-founder of the TOP500 project.

Developing Simulation Software to Combat Humanity’s Biggest Issues

February 25, 2015 12:36 pm | by Queen’s University Belfast | News | Comments

Researchers are creating ground-breaking computer software, which has the potential to develop...

PNNL Shifts Computational Chemistry into Overdrive

February 25, 2015 8:29 am | by Karol Kowalski, Ph.D., and Edoardo Apra, Ph.D. | Articles | Comments

We computational chemists are an impatient lot. Despite the fact that we routinely deal with...

Powering a New Era of Deep Learning

February 20, 2015 12:42 pm | by Stephen Jones, NVIDIA | Blogs | Comments

GPU-accelerated applications have become ubiquitous in scientific supercomputing. Now, we are...



OpenPOWER Announces “Rethink the Data Center” Speaker Lineup

February 20, 2015 11:26 am | by OpenPOWER Foundation | News | Comments

The OpenPOWER Foundation has announced a solid lineup of speakers headlining its inaugural OpenPOWER Summit at NVIDIA’s GPU Technology Conference at the San Jose Convention Center, March 17-19, 2015. Drawing from the open development organization’s more than 100 members worldwide, the Summit’s organizers have lined up over 35 member presentations tied to the event’s “Rethink the Data Center” theme.

Another myth is that scientists look like this. U.S. Army RDECOM/Flickr, CC BY-SA

Seven Myths about Scientists Debunked

February 19, 2015 2:07 pm | by Jeffrey Craig and Marguerite Evans-Galea, Murdoch Childrens Research Institute | Articles | Comments

As scientific researchers, we are often surprised by some of the assumptions made about us by those outside our profession. So we put together a list of common myths we and our colleagues have heard anecdotally regarding scientific researchers.

Daniel Sanchez, Nathan Beckmann and Po-An Tsai have found that the ways in which a chip carves up computations can make a big difference to performance. -- Courtesy of Bryce Vickmark

Making Smarter, Much Faster Multicore Chips

February 19, 2015 2:02 pm | by Larry Hardesty, MIT | News | Comments

Computer chips’ clocks have stopped getting faster. To keep delivering performance improvements, chipmakers are instead giving chips more processing units, or cores, which can execute computations in parallel. But the ways in which a chip carves up computations can make a big difference to performance.

The first in its series, ISC Cloud & Big Data will highlight the synergies between cloud and big data and present ways these technologies can build on each other’s strengths.

Inaugural ISC Cloud & Big Data Conference to be Held in Frankfurt

February 19, 2015 10:36 am | by Suzanne Tracy, Editor-in-Chief, Scientific Computing and HPC Source | News | Comments

The inaugural international ISC Cloud & Big Data conference is a three-day multiple-track event that is replacing the ISC Cloud and ISC Big Data conferences, which were held separately over the past five years. Taking place from September 28 to 30, 2015, the conference will be held in Frankfurt, Germany, at the Frankfurt Marriott Hotel.

Tim Cutts is Head of Scientific Computing at Wellcome Trust Sanger Institute

Modern DNA Sequencing Requires a Modern Day Approach

February 13, 2015 2:27 pm | by Tim Cutts, Wellcome Trust Sanger Institute | Blogs | Comments

The sequencing machines that run today produce data several orders of magnitude faster than the machines used in the Human Genome Project. We at the Wellcome Trust Sanger Institute currently produce more sequences in one hour than we did in our first 10 years of operation. A great deal of computational resource is then needed to process that data.

Qian and colleagues found that the topological phases in the TMDC materials can be turned on and off by simply applying a vertical electric field perpendicular to the atomic plane of the material.

Exotic States Materialize with Supercomputers

February 13, 2015 11:26 am | by Jorge Salazar, Texas Advanced Computing Center | News | Comments

Scientists used supercomputers to find a new class of materials that possess an exotic state of matter known as the quantum spin Hall effect. The researchers published their results in the journal Science in December 2014, where they propose a new type of transistor made from these materials.

In this real-time, non-stop, 48-hour challenge, teams of undergraduate and/or high school students will assemble a small cluster on the SC15 exhibit floor and race to demonstrate the greatest sustained performance across a series of applications.

SC15 to Strengthen, Enhance Programs for Students, Early Career Researchers

February 13, 2015 10:32 am | by SC15 | News | Comments

For the past 15 years, the annual SC conference has welcomed hundreds of students to the week-long conference held every November, providing an entry into the community of high performance computing and networking. For SC15 in Austin, the student programs will be coordinated as a broader program to recruit a diverse group of students, ranging from undergraduates to graduate students, as well as researchers in the early stages of their careers.

Founded in 1999, D-Wave Systems describes itself as “the first commercial quantum computing company.”

Analog Quantum Computers: Still Wishful Thinking?

February 12, 2015 2:24 pm | by European Physical Journal (EPJ) | News | Comments

Many challenges lie ahead before quantum annealing, the analog version of quantum computation, can contribute to solving combinatorial optimization problems. Traditional computational tools are simply not powerful enough to solve some complex optimization problems, such as protein folding.

Suzanne Tracy, Editor-in-Chief, Scientific Computing and HPC Source

Helping to Save Lives of Critically Ill Children

February 12, 2015 10:17 am | by Suzanne Tracy, Editor-in-Chief, Scientific Computing and HPC Source | Articles | Comments

For those on the front lines of treating cancer, speed and precision are key to patients’ survival. Pediatric cancer researchers have been making incredible strides in accelerating delivery of new diagnostic and treatment options. Supercomputer-powered genetic diagnosis is being used to harness the power of high throughput genomic and proteomic methods and is playing a key role in improving the outcome for children with genetic diseases.

Supported through crowdfunding, researchers have concluded a successful experiment to identify a novel genetic mutation as the source of a specific rare disease.

Crowdfunding Helps Solve Rare Disease Mystery

February 11, 2015 1:47 pm | by Tel Aviv University | News | Comments

Rare diseases — those that affect fewer than one in 200,000 people — are often identified early in life. Some 30 percent of children afflicted by these "orphan diseases" do not live to see their fifth birthday. While the US Orphan Drug Act of 1983 was written into law to promote research on the topic, the cost of identifying the source and progression of these diseases remains prohibitive for many families.

Rob Farber is an independent HPC expert to startups and Fortune 100 companies, as well as government and academic organizations.

Using Profile Information for Optimization, Energy Savings and Procurements

February 9, 2015 12:11 pm | by Rob Farber | Articles | Comments

Optimization for high performance and energy efficiency is a necessary next step after verifying that an application works correctly. In the HPC world, profiling means collecting data from hundreds to potentially many thousands of compute nodes over the length of a run. In other words, profiling is a big-data task, but one where the rewards can be significant — including potentially saving megawatts of power or reducing the time to solution.

Snow and icy conditions affect human decisions about transportation. These decisions can ripple through other infrastructure systems, causing widespread disruptions. Shown here are points of connectivity. Courtesy of Paul M. Torrens and Cheng Fu, University of Maryland

Big Data Techniques More Accurately Model People in a Winter Wonderland

February 6, 2015 2:53 pm | by Cecile J. Gonzalez, NSF | News | Comments

For Paul Torrens, wintry weather is less about sledding and more about testing out models of human behavior. Torrens, a geographer at the University of Maryland, studies how snow and icy conditions affect human decisions about transportation. He also studies how these decisions ripple through other infrastructure systems.

Brain researcher Marianne Fyhn receives computational help from, among others, Gaute Einevoll and Anders Malthe-Sørenssen to acquire an understanding of how the brain works.

Mathematics to Reveal Secrets of the Brain

February 5, 2015 4:33 pm | by Yngve Vogt, University of Oslo | News | Comments

Top researchers are using mathematical modelling and heavy computations to understand how the brain can both remember and learn. Ten years ago, when the team of Marianne Fyhn and Torkel Hafting Fyhn cooperated with the Nobel Prize-winning team of May-Britt and Edvard Moser at NTNU, they discovered the sense of orientation in the brain.

Microscopic image of senile plaques seen in the cerebral cortex of a person with Alzheimer's disease of presenile onset. Courtesy of KGH

Blue Waters Project helps Uncover Alzheimer's Complex Genetic Networks

February 5, 2015 4:06 pm | by NSF | News | Comments

The release of the film, Still Alice, in September 2014 shone a much-needed light on Alzheimer's disease, a debilitating neurological disease that affects a growing number of Americans each year. More than 5.2 million people in the U.S. are currently living with Alzheimer's. One out of nine Americans over 65 has Alzheimer's, and one out of three over 85 has the disease. For those over 65, it is the fifth leading cause of death.

In the United States, big data environments are utilizing advanced computing systems to map phenotype to underlying process and to compare those who develop disease with those who don't.

Reversing the Global Diabesity Epidemic

February 5, 2015 2:38 pm | by Suzanne Tracy, Editor-in-Chief, Scientific Computing and HPC Source | Blogs | Comments

Diabesity has been identified as a major global health problem by researchers and healthcare professionals worldwide, including England's National Health Service, Brigham and Women's Hospital and Harvard Medical School, Ain Shams University Hospital in Cairo, Egypt, and a research consortium of the European Union.

ISC High Performance is open to engineers, IT specialists, systems developers, vendors, end users, scientists, researchers, students and other members of the HPC global community.

Submission Period Ending Soon for ISC High Performance 2015

February 4, 2015 9:29 am | by ISC | News | Comments

In less than two weeks, most of the ISC High Performance submission opportunities will come to an end, so the organizers urge you to act now. Workshops, tutorials, birds-of-a-feather (BoF) sessions, and research poster sessions remain open for submission until February 15. The student volunteer program application period ends April 10.

When Northwestern University professor Luis Amaral set out to test LDA, he found that it was neither as accurate nor reproducible as a leading topic modeling algorithm should be.

Taking a Network Approach to Building Trustworthy Big Data Algorithms

February 2, 2015 1:08 pm | by Emily Ayshford, Northwestern University | News | Comments

Much of the world's data sits in large databases of unstructured text. Finding insights among e-mails, text documents and Web sites is extremely difficult unless we can search, characterize and classify that text in a meaningful way. A leading big data algorithm for finding related topics within unstructured text is LDA (latent Dirichlet allocation). But Luis Amaral found that it was neither as accurate nor as reproducible as a leading topic modeling algorithm should be.
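As an illustrative sketch of what LDA topic modeling looks like in practice — using scikit-learn's implementation and made-up toy documents, not the tooling or data from Amaral's study — a model learns a small set of topics from word counts and assigns each document a distribution over those topics:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Toy corpus: two documents about genomics, two about multicore chips.
docs = [
    "genome sequencing dna genetic mutation",
    "dna genome variant sequencing cell",
    "gpu cores parallel chip performance",
    "multicore chip cache cores performance",
]

# Bag-of-words counts, then fit a 2-topic LDA model.
counts = CountVectorizer().fit_transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)

# Each document is summarized as a probability distribution over the topics.
doc_topics = lda.transform(counts)
print(doc_topics.shape)  # (4, 2)
```

Because LDA's fitting procedure is stochastic, two runs with different seeds can produce noticeably different topics — which is the reproducibility concern the article raises.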

Customized Treatment: "Charting the versions of the genes that are only found in cancer cells may help tailor the treatment offered to each patient," says Rolf Skotheim. Courtesy of Yngve Vogt

Supercomputing Reveals Genetic Code of Cancer

February 2, 2015 12:46 pm | by Yngve Vogt, University of Oslo | News | Comments

Cancer researchers must use one of the world's fastest computers to detect which versions of genes are only found in cancer cells. Every form of cancer, even every tumor, has its own distinct variants. A research group is working to identify the genes that cause bowel and prostate cancer, which are both common diseases. There are 4,000 new cases of bowel cancer in Norway every year. Only six out of 10 patients survive the first five years.

In simulations, algorithms using the new data structure continued to demonstrate performance improvement with the addition of new cores, up to a total of 80 cores. Courtesy of Christine Daniloff/MIT

Parallelizing Common Algorithms: Priority Queue Implementation Keeps Pace with New Cores

January 30, 2015 3:49 pm | by Larry Hardesty, MIT News Office | News | Comments

Every undergraduate computer-science major takes a course on data structures, which describes different ways of organizing data in a computer’s memory. Every data structure has its own advantages: Some are good for fast retrieval, some for efficient search, some for quick insertions and deletions, and so on.
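For readers who want a concrete reminder of the data structure in question, here is a minimal sequential priority queue using Python's standard heapq module — the single-core baseline whose concurrent, multicore-scalable counterpart is the subject of the MIT work:

```python
import heapq

# A binary-heap-backed priority queue: O(log n) insert and pop-min.
# This sequential version is the baseline; the research described above
# targets concurrent variants that keep scaling as cores are added.
tasks = []
heapq.heappush(tasks, (3, "low-priority job"))
heapq.heappush(tasks, (1, "urgent job"))
heapq.heappush(tasks, (2, "normal job"))

# Items come out in priority order, regardless of insertion order.
order = [heapq.heappop(tasks)[1] for _ in range(len(tasks))]
print(order)  # ['urgent job', 'normal job', 'low-priority job']
```

The difficulty the article alludes to is that a single shared heap like this serializes on its root element, so naive locking stops scaling beyond a handful of cores.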

The Alan Turing Institute will promote the development and use of advanced mathematics, computer science, algorithms and big data for human benefit.

Alan Turing Institute Positioned to Break New Big Data, Online Security Boundaries

January 30, 2015 11:41 am | by Engineering and Physical Sciences Research Council | News | Comments

Five universities have been selected to lead the new Alan Turing Institute. The Institute will build on the UK's existing academic strengths and help position the country as a world leader in the analysis and application of big data and algorithm research. Its headquarters will be based at the British Library at the center of London's Knowledge Quarter.

“In nanomedicine we need to understand physical phenomena on a nano scale, forming as correct a picture as possible of molecular phenomena. In this context, quantum chemical calculations are important,” says Michele Cascella. Courtesy of Hanne Utigard

Quantum Chemistry Closing in on Quantum Mechanics of Living Cells

January 30, 2015 11:19 am | by Yngve Vogt, University of Oslo | News | Comments

Quantum chemical calculations have been used to solve big mysteries in space. Soon the same calculations may be used to produce tomorrow’s cancer drugs. Quantum chemical calculations are needed to explain what happens to the electrons’ trajectories within a molecule, and the results of a quantum chemical calculation are often more accurate than what is achievable experimentally.

As the Earth rotates every 24 hours, the orientation of the ions in the quantum computer/detector changes with respect to the Sun's rest frame.

Quantum Computer’s Extremely Precise Measurements Show Space is Not Squeezed

January 29, 2015 3:19 pm | by Robert Sanders, UC Berkeley | News | Comments

Ever since Einstein proposed his special theory of relativity, physics and cosmology have been based on the assumption that space looks the same in all directions — that it’s not squeezed in one direction relative to another. A new experiment used partially entangled atoms — identical to the qubits in a quantum computer — to demonstrate more precisely than ever before that this is true, to one part in a billion billion.


Improving Data Mobility and Management for International Cosmology Workshop

January 28, 2015 3:06 pm | by Lawrence Berkeley National Laboratory | Events

Registration is now open for a workshop on “Improving Data Mobility and Management for International Cosmology” to be held February 10-11, 2015, at Lawrence Berkeley National Laboratory in California. The workshop, one in a series of Cross-Connects workshops, is sponsored by the Department of Energy’s ESnet and Internet2. Early registration is encouraged, as attendance is limited.

Arabidopsis thaliana, a model flowering plant studied by biologists, has climate-sensitive genes whose expression was found to evolve. Courtesy of Penn State

Needle in a Haystack: Finding the Right Genes in Tens of Thousands

January 28, 2015 2:45 pm | by TACC | News | Comments

Scientists using supercomputers found that genes sensitive to cold and drought in a plant help it survive climate change. The computational challenges were daunting, involving thousands of individual strains of the plant with hundreds of thousands of markers across the genome and testing for a dozen environmental variables. Their findings increase basic understanding of plant adaptation and can be applied to improve crops.

This simulation, which models a rheometer with particles, can help determine how well a rheometer design works at characterizing a fluid. The NIST team is performing a number of simulations like this one, varying the shape and number of blades.

Predicting Concrete Flow Properties from Simple Measurements

January 23, 2015 2:44 pm | by NIST | News | Comments

Just because concrete is the most widely used building material in human history doesn’t mean it can’t be improved. A recent study using DOE Office of Science supercomputers has led to a new way to predict concrete’s flow properties from simple measurements. The results should help accelerate the design of a new generation of high-performance and eco-friendly cement-based materials by reducing time and costs associated with R&D.
