ISC is introducing the Hans Meuer Award to honor the most outstanding research paper submitted to the ISC High Performance conference’s research paper committee. This annual award has been created in memory of the late Dr. Hans Meuer, general chair of the ISC conference from 1986 through 2014, and co-founder of the TOP500 project.
Researchers are creating ground-breaking computer software, which has the potential to develop...
We computational chemists are an impatient lot. Despite the fact that we routinely deal with...
The OpenPOWER Foundation has announced a solid lineup of speakers headlining its inaugural OpenPOWER Summit at NVIDIA’s GPU Technology Conference at the San Jose Convention Center, March 17-19, 2015. Drawing from the open development organization’s more than 100 members worldwide, the Summit’s organizers have lined up over 35 member presentations tied to the event’s “Rethink the Data Center” theme.
As scientific researchers, we are often surprised by some of the assumptions made about us by those outside our profession. So we put together a list of common myths we and our colleagues have heard anecdotally regarding scientific researchers.
Computer chips’ clocks have stopped getting faster. To keep delivering performance improvements, chipmakers are instead giving chips more processing units, or cores, which can execute computations in parallel. But the ways in which a chip carves up computations can make a big difference to performance.
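To make the partitioning point concrete, here is a minimal Python sketch (not from the article): the same total work is handed to a pool of worker processes in coarse chunks and then one task at a time, so the cost of fine-grained scheduling shows up directly in the wall-clock times.

```python
# Minimal sketch (not from the article): how the granularity of work
# partitioning affects parallel performance. The same total work is
# distributed to 4 worker processes in coarse chunks, then one task
# at a time; the difference is scheduling overhead.
import time
from multiprocessing import Pool

def work(n):
    # A small CPU-bound task standing in for one unit of computation.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    tasks = [20_000] * 400
    with Pool(processes=4) as pool:
        for chunksize in (100, 1):
            start = time.perf_counter()
            pool.map(work, tasks, chunksize=chunksize)
            print(f"chunksize={chunksize:3d}: "
                  f"{time.perf_counter() - start:.3f}s")
```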
The inaugural international ISC Cloud & Big Data conference is a three-day, multi-track event that replaces the ISC Cloud and ISC Big Data conferences, which were held separately over the past five years. Taking place from September 28 to 30, 2015, the conference will be held in Frankfurt, Germany, at the Frankfurt Marriott Hotel.
The sequencing machines that run today produce data several orders of magnitude faster than the machines used in the Human Genome Project. We at the Wellcome Trust Sanger Institute currently produce more sequences in one hour than we did in our first 10 years of operation. A great deal of computational resource is then needed to process that data.
Scientists used supercomputers to find a new class of materials that possess an exotic state of matter known as the quantum spin Hall effect. The researchers published their results in the journal Science in December 2014, proposing a new type of transistor made from these materials.
For the past 15 years, the annual SC conference has welcomed hundreds of students to its week-long gathering each November, providing an entry point into the community of high performance computing and networking. For SC15 in Austin, the student activities will be coordinated as a broader program to recruit a diverse group of students, ranging from undergraduates to graduate students, as well as researchers in the early stages of their careers.
Many challenges lie ahead before quantum annealing, the analog version of quantum computation, can contribute to solving combinatorial optimization problems. Traditional computational tools are simply not powerful enough to solve some complex optimization problems, such as protein folding.
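For reference, here is a minimal sketch of classical simulated annealing, the thermal cousin of quantum annealing, run on a toy optimization problem; all names and parameters are illustrative, not drawn from the article or any annealing hardware.

```python
# Minimal sketch (illustrative, not the article's method): classical
# simulated annealing, the thermal cousin of quantum annealing, run on
# a toy 1-D Ising-style energy over binary spins.
import math
import random

def energy(spins, J):
    # Energy of a 1-D Ising chain with couplings J between neighbors.
    return -sum(J[i] * spins[i] * spins[i + 1] for i in range(len(spins) - 1))

def anneal(n=20, steps=20_000, t_start=2.0, t_end=0.01, seed=0):
    rng = random.Random(seed)
    J = [rng.choice((-1.0, 1.0)) for _ in range(n - 1)]
    spins = [rng.choice((-1, 1)) for _ in range(n)]
    e = energy(spins, J)
    for step in range(steps):
        t = t_start * (t_end / t_start) ** (step / steps)  # geometric cooling
        i = rng.randrange(n)
        spins[i] *= -1                      # propose flipping one spin
        e_new = energy(spins, J)
        # Metropolis rule: always accept downhill moves; accept uphill
        # moves with probability exp(-dE/T) so the search can escape
        # local minima while the temperature is still high.
        if e_new <= e or rng.random() < math.exp((e - e_new) / t):
            e = e_new
        else:
            spins[i] *= -1                  # reject: undo the flip
    return spins, e

print(anneal())
```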
For those on the front lines of treating cancer, speed and precision are key to patients' survival. Pediatric cancer researchers have been making incredible strides in accelerating the delivery of new diagnostic and treatment options. Supercomputer-powered genetic diagnosis harnesses high-throughput genomic and proteomic methods and is playing a key role in improving outcomes for children with genetic diseases.
Rare diseases — those that affect fewer than one in 200,000 people — are often identified early in life. Some 30 percent of children afflicted by these "orphan diseases" do not live to see their fifth birthday. While the US Orphan Drug Act of 1983 was written into law to promote research on the topic, the cost of identifying the source and progression of these diseases remains prohibitive for many families.
Optimizing for high performance and energy efficiency is a necessary next step after verifying that an application works correctly. In the HPC world, profiling means collecting data from hundreds to potentially many thousands of compute nodes over the length of a run. In other words, profiling is itself a big-data task, but one where the rewards can be significant, including potentially saving megawatts of power or reducing the time to solution.
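As a small-scale illustration (with a hypothetical data layout, not any particular profiler's format), the sketch below reduces per-rank timing records into a single cross-run hotspot summary, which is the core aggregation such a profiling pipeline performs.

```python
# Minimal sketch (hypothetical data layout, not any particular profiler's
# format): reducing per-rank timing records into one cross-run hotspot
# summary, the core aggregation a large-scale profiling pipeline performs.
from collections import Counter

def aggregate_hotspots(per_rank_samples):
    """per_rank_samples: iterable of {function_name: seconds} dicts,
    one per compute node or MPI rank."""
    totals = Counter()
    for samples in per_rank_samples:
        totals.update(samples)           # sum times function by function
    return totals.most_common()          # hottest functions first

# Toy usage with fabricated numbers for two ranks.
ranks = [
    {"fft_forward": 12.4, "halo_exchange": 3.1, "io_write": 0.8},
    {"fft_forward": 11.9, "halo_exchange": 4.6, "io_write": 0.7},
]
for name, seconds in aggregate_hotspots(ranks):
    print(f"{name:14s} {seconds:6.1f} s")
```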
For Paul Torrens, wintry weather is less about sledding and more about testing out models of human behavior. Torrens, a geographer at the University of Maryland, studies how snow and icy conditions affect human decisions about transportation. He also studies how these decisions ripple through other infrastructure systems.
Top researchers are using mathematical modelling and heavy computation to understand how the brain can both remember and learn. Ten years ago, the team of Marianne Fyhn and Torkel Hafting Fyhn, collaborating with the Nobel Prize-winning team of May-Britt and Edvard Moser at NTNU, discovered the brain's sense of orientation.
The release of the film Still Alice in September 2014 shone a much-needed light on Alzheimer's disease, a debilitating neurological disorder that affects a growing number of Americans each year. More than 5.2 million people in the U.S. are currently living with Alzheimer's. One in nine Americans over 65 has Alzheimer's, and one in three over 85 has the disease. For those over 65, it is the fifth leading cause of death.
Diabesity has been identified as a major global health problem by researchers and healthcare professionals worldwide, including England's National Health Service, Brigham and Women's Hospital and Harvard Medical School, Ain Shams University Hospital in Cairo, Egypt, and a research consortium of the European Union.
In less than two weeks, most of the ISC High Performance submission opportunities will come to an end, so the organizers urge you to act now. Workshops, tutorials, birds-of-a-feather (BoF) sessions, and research poster sessions remain open for submissions until February 15. The student volunteer program application closes April 10.
Much of our data sits in large databases of unstructured text. Finding insights among e-mails, text documents, and Web sites is extremely difficult unless we can search, characterize, and classify that text in a meaningful way. A leading big data algorithm for finding related topics within unstructured text is latent Dirichlet allocation (LDA). But Luis Amaral found that it was neither as accurate nor as reproducible as a leading topic modeling algorithm should be ...
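For context on what LDA does in practice, here is a minimal sketch using the gensim library (an assumption; the article names no toolkit): it fits LDA to a toy corpus twice with different random seeds, making the reproducibility question easy to see first-hand.

```python
# Minimal sketch (assumes the gensim library; the article names no toolkit):
# fit latent Dirichlet allocation (LDA) to a toy corpus twice with different
# random seeds, so any run-to-run variation in the topics is visible.
from gensim import corpora, models

docs = [
    ["genome", "sequence", "cell", "protein"],
    ["stock", "market", "price", "trade"],
    ["sequence", "protein", "fold", "cell"],
    ["market", "trade", "bank", "price"],
]
dictionary = corpora.Dictionary(docs)
corpus = [dictionary.doc2bow(doc) for doc in docs]

for seed in (1, 2):
    lda = models.LdaModel(corpus, id2word=dictionary,
                          num_topics=2, random_state=seed, passes=10)
    print(f"seed={seed}:", lda.print_topics())
```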
Cancer researchers need one of the world's fastest computers to detect which versions of genes are found only in cancer cells. Every form of cancer, even every tumor, has its own distinct variants. A research group is working to identify the genes that cause bowel and prostate cancer, two common diseases. There are 4,000 new cases of bowel cancer in Norway every year, and only six out of 10 patients survive the first five years.
Every undergraduate computer-science major takes a course on data structures, which describes different ways of organizing data in a computer’s memory. Every data structure has its own advantages: Some are good for fast retrieval, some for efficient search, some for quick insertions and deletions, and so on.
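A minimal Python sketch of those trade-offs (illustrative, not from the article): a list must scan for membership, a set hashes straight to the answer, and a heap keeps the smallest element cheap to reach.

```python
# Minimal sketch (not from the article) of the trade-offs such a course
# catalogues: list membership is O(n), set membership is O(1) on average,
# and a heap gives O(log n) access to the smallest element.
import heapq
import timeit

n = 100_000
as_list = list(range(n))
as_set = set(as_list)

# Worst-case membership test: the element sits at the end of the list.
print("list lookup:", timeit.timeit(lambda: n - 1 in as_list, number=100))
print("set lookup: ", timeit.timeit(lambda: n - 1 in as_set, number=100))

heap = as_list[:]
heapq.heapify(heap)                         # O(n) build
print("min element:", heapq.heappop(heap))  # O(log n) removal
```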
Five universities have been selected to lead the new Alan Turing Institute. The Institute will build on the UK's existing academic strengths and help position the country as a world leader in the analysis and application of big data and algorithm research. Its headquarters will be based at the British Library, at the center of London's Knowledge Quarter.
Quantum chemical calculations have been used to solve big mysteries in space. Soon the same calculations may be used to produce tomorrow’s cancer drugs. Quantum chemical calculations are needed to explain what happens to the electrons’ trajectories within a molecule, and the results of a quantum chemical calculation are often more accurate than what is achievable experimentally.
Ever since Einstein proposed his special theory of relativity, physics and cosmology have been based on the assumption that space looks the same in all directions — that it’s not squeezed in one direction relative to another. A new experiment used partially entangled atoms — identical to the qubits in a quantum computer — to demonstrate more precisely than ever before that this is true, to one part in a billion billion.
Registration is now open for a workshop on “Improving Data Mobility and Management for International Cosmology” to be held February 10-11, 2015, at Lawrence Berkeley National Laboratory in California. The workshop, one in a series of Cross-Connects workshops, is sponsored by the Department of Energy's ESnet and Internet2. Early registration is encouraged, as attendance is limited.
Scientists using supercomputers found that genes sensitive to cold and drought help a plant survive climate change. The computational challenges were daunting: thousands of individual strains of the plant, hundreds of thousands of markers across the genome, and tests against a dozen environmental variables. The findings deepen basic understanding of plant adaptation and can be applied to improve crops.
Just because concrete is the most widely used building material in human history doesn’t mean it can’t be improved. A recent study using DOE Office of Science supercomputers has led to a new way to predict concrete’s flow properties from simple measurements. The results should help accelerate the design of a new generation of high-performance and eco-friendly cement-based materials by reducing time and costs associated with R&D.