HPC

The Lead

Deriving Real Time Value from Big Data

May 22, 2015 9:51 am | by Pat McGarry, Ryft Systems | Blogs | Comments

Everyone has heard the old adage that time is money. In today’s society, business moves at the speed of making a phone call, looking something up online via your cell phone, or posting a tweet. So, when time is money (and can be a lot of money), why are businesses okay with waiting weeks or even months to get valuable information from their data?

SC15 Scientific Visualization Showcase Submissions due July 31

May 21, 2015 2:53 pm | by Suzanne Tracy, Editor-in-Chief, Scientific Computing and HPC Source | News | Comments

SC15’s Visualization and Data Analytics Showcase Program will provide a forum for the year's...

Playing Graphics-intensive Fast-Action Games in the Cloud without Guzzling Gigabytes

May 21, 2015 9:50 am | by Duke University | News | Comments

Gamers might one day be able to enjoy the same graphics-intensive fast-action video games they...

Next-gen Computing: Closing in on Speeds Millions of Times Faster than Current Machines

May 19, 2015 4:49 pm | by University of Utah | News | Comments

Engineers have taken a step forward in creating the next generation of computers and mobile...

James Reinders is chief evangelist for Intel’s software products.

Software and Moore’s Drumbeat (Moore’s Law)

May 19, 2015 2:48 pm | by James Reinders, Intel | Blogs | Comments

Moore’s Law recently turned 50 years old, and many have used the milestone to tout its virtues, highlight the positive results that stem from it, advance suggestions about what its future dividends will be, and boldly project the date of its inevitable demise. Moore’s Law is an observation that has undoubtedly inspired us to innovate at the pace it predicts; it has challenged us to do so. That is why I think of it as Moore’s drumbeat.
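
For reference, the observation itself is commonly summarized (a general paraphrase, not Reinders’ own formulation) as transistor counts on a chip doubling roughly every two years:

$$ N(t) \approx N_0 \cdot 2^{\,t/(2\ \mathrm{years})} $$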

An event display from the LHCb experiment at the Large Hadron Collider shows examples of collisions that produced candidates for the rare decay of the Bs particle, predicted and observed to occur only about four times out of a billion. Courtesy of the LHCb collaboration

Two Large Hadron Collider Experiments First to Observe Rare Subatomic Process

May 18, 2015 11:22 am | by Fermi National Accelerator Laboratory | News | Comments

Two experiments at the Large Hadron Collider at the European Organization for Nuclear Research (CERN) in Geneva, Switzerland, have combined their results and observed a previously unseen subatomic process. A joint analysis by the CMS and LHCb collaborations has established a new and extremely rare decay of the Bs particle (a heavy composite particle consisting of a bottom antiquark and a strange quark) into two muons.
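
Expressed as a branching fraction, the roughly four-in-a-billion rate quoted above corresponds to:

$$ \mathcal{B}(B_s^0 \to \mu^+\mu^-) \approx 4 \times 10^{-9} $$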

Researchers show how to build a digital blind signature scheme under the assumption that they have an offline repository and are using quantum information.

Blind Signatures Using Offline Repositories Provide New Level of Security

May 15, 2015 3:35 pm | by World Scientific | News | Comments

In the new era of quantum computers, many daily-life applications, such as home banking, are doomed to failure, and new forms of ensuring the confidentiality of our data are being studied to overcome this threat. Researchers have taken a step in this direction and propose a quantum blind signature scheme, which ensures that signatures cannot be copied and that the sender must commit to a single message.

Emphasizing the less common classes in datasets leads to improved accuracy in feature selection.

Counterintuitive Approach Yields Big Benefits for High-dimensional, Small-sized Problems

May 15, 2015 3:04 pm | by Agency for Science, Technology and Research (A*STAR) | News | Comments

Extracting meaningful information out of clinical datasets can mean the difference between a successful diagnosis and a protracted illness. However, datasets can vary widely, both in the number of ‘features’ measured and in the number of independent observations taken. Now, researchers have developed an approach for targeted feature selection from datasets with small sample sizes, which tackles the so-called class imbalance problem.
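
The article does not spell out the A*STAR algorithm, but the general idea of up-weighting the rare class during feature selection can be sketched in a few lines of scikit-learn. The synthetic data, the L1-penalized logistic model, and all parameters below are illustrative assumptions, not the published method.

```python
# Minimal sketch: class-weighted feature selection on a high-dimensional,
# small-sample, imbalanced dataset (illustrative only; not the A*STAR method).
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectFromModel
from sklearn.linear_model import LogisticRegression

# Synthetic clinical-style data: 60 samples, 500 features, ~10% minority class.
X, y = make_classification(n_samples=60, n_features=500, n_informative=10,
                           weights=[0.9, 0.1], random_state=0)

# class_weight="balanced" up-weights the rare class, so features that separate
# it are not drowned out by the majority class; the L1 penalty keeps only a
# sparse subset of features.
selector = SelectFromModel(
    LogisticRegression(penalty="l1", solver="liblinear",
                       class_weight="balanced", C=0.5))
selector.fit(X, y)
print("features kept:", selector.get_support().sum())
```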

BigNeuron, a new project led by the Allen Institute for Brain Science, aims to streamline scientists’ ability to create 3-D digital models of neurons. Courtesy of Allen Institute for Brain Science

Digitizing Neurons: Project will convert 2-D Microscope Images into 3-D Models

May 14, 2015 9:46 am | by Oak Ridge National Laboratory | News | Comments

A new initiative designed to advance how scientists digitally reconstruct and analyze individual neurons in the human brain will receive support from the supercomputing resources at the Department of Energy’s Oak Ridge National Laboratory (ORNL). Led by the Allen Institute for Brain Science, the BigNeuron project aims to create a common platform for analyzing the three-dimensional structure of neurons.

The ISC Cloud & Big Data Research Committee is accepting submissions of high-quality papers on theoretical, experimental, and industrial research and development until Tuesday, May 19, 2015.

Last Chance to Submit ISC Cloud & Big Data Research Papers

May 13, 2015 12:05 pm | by Suzanne Tracy, Editor-in-Chief, Scientific Computing and HPC Source | News | Comments

The ISC Cloud & Big Data Research Committee is accepting submissions until Tuesday, May 19, 2015. The Research Paper Sessions “aim to provide first-class open forums for engineers and scientists in academia, industry and government to present and discuss issues, trends and results to shape the future of cloud computing and big data.” The sessions will be held on Tuesday, September 29 and on Wednesday, September 30, 2015.

Established in late 1997, the award presents a crystal memento, an illuminated certificate, and a $10,000 honorarium to recognize innovative contributions to high performance computing systems that best exemplify the creative spirit demonstrated by Seymour Cray.

Nominations for Three SC15 Awards due July 1

May 8, 2015 9:52 am | by SC15 | News | Comments

Each year, the global supercomputing community honors a handful of the leading contributors to the field with the presentation of the IEEE Seymour Cray Computer Science and Engineering Award, the IEEE Sidney Fernbach Memorial Award and the ACM-IEEE Ken Kennedy Award. Nominations for these awards to be presented at SC15 in Austin are now open and the submission deadline is Wednesday, July 1, 2015.

The German Climate Computing Center is managing the world's largest climate simulation data archive, used by leading climate researchers worldwide. The archive currently consists of more than 40 petabytes of data and is projected to grow by roughly 75 petabytes annually over the next five years.

Managing the World's Largest Trove of Climate Data

May 8, 2015 9:10 am | by IBM | News | Comments

The German Climate Computing Center is managing the world's largest climate simulation data archive, used by climate researchers worldwide. The archive consists of more than 40 petabytes of data and is projected to grow by roughly 75 petabytes annually over the next five years. As climate simulations are carried out on increasingly powerful supercomputers, massive amounts of data are produced that must be effectively stored and analyzed.
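
Taking those figures at face value, a rough back-of-the-envelope projection for the archive's size five years from now is:

$$ 40\ \mathrm{PB} + 5 \times 75\ \mathrm{PB} \approx 415\ \mathrm{PB} $$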

Naoya Maruyama RIKEN AICS

Naoya Maruyama

May 7, 2015 3:44 pm | Biographies

Naoya Maruyama is a Team Leader at the RIKEN Advanced Institute for Computational Science (RIKEN AICS), where he leads the HPC Programming Framework Research Team. He joined RIKEN AICS in 2012 after several years at Tokyo Institute of Technology, where he received his Ph.D. in Computer Science in 2008.

Craig Lucas, Senior Technical Consultant, NAG, UK

Craig Lucas

May 7, 2015 3:37 pm | Biographies

As a Senior Technical Consultant at NAG, Craig Lucas's key area of work is high performance computing. He also works on NAG's Numerical Libraries, with an emphasis on multi-core parallelism and numerical linear algebra. Craig joined NAG in 2008 to work on the HECToR service and is based in NAG's Manchester office. Before this, he worked on CSAR, a previous national supercomputing service. He has also contributed software to the LAPACK library.

Technology-related announcements included development of a brand new x86 processor core codenamed “Zen,” expected to drive AMD’s re-entry into high-performance desktop and server markets through improved instructions per clock of up to 40 percent.

AMD Announces “Zen” x86 Processor Core

May 7, 2015 12:11 pm | by AMD | News | Comments

AMD provided details of the company’s multi-year strategy to drive profitable growth based on delivering next-generation technologies powering a broad set of high-performance, differentiated products. Technology-related announcements included the development of a brand new x86 processor core codenamed “Zen,” which will feature simultaneous multi-threading (SMT) for higher throughput and a new cache subsystem.

I-hsin Chung, IBM Research

I-hsin Chung

May 6, 2015 4:57 pm | Biographies

I-hsin Chung is a Research Staff Member at the Thomas J. Watson Research Center, IBM Research. He holds a Ph.D. in Computer Science from the University of Maryland, College Park.

Francois Bodin, Irisa

Francois Bodin

May 6, 2015 4:47 pm | Biographies

François Bodin co-founded CAPS entreprise in 2002 while he was a Professor at the University of Rennes I, and joined the company as CTO in January 2008. His contributions include new approaches for exploiting high performance processors in scientific computing and in embedded applications. Prior to joining CAPS, François Bodin held various research positions at the University of Rennes I and at the IRISA/INRIA research lab.

Pekka Lehtovuori, Center for Scientific Computing, Finland

Pekka Lehtovuori

May 6, 2015 4:01 pm | Biographies

Dr. Pekka Lehtovuori holds a Ph.D. in Physical Chemistry. He is currently working as Director, Services for Research, at CSC - the Finnish IT center for science. He was previously responsible for the coordination and operation of CSC's international and national grid infrastructures (e.g. DEISA, EGEE, NDGF, and the Finnish national computing grid, M-grid).

Fred Johnson

May 6, 2015 3:19 pm | Biographies

Sonia Sachs, US Department of Energy

Sonia Sachs

May 6, 2015 3:18 pm | Biographies

Sonia Sachs is an expert in the design and delivery of cost-effective, high-performance technology solutions in support of large and complex scientific and commercial projects and programs, with budget responsibilities of up to $20 million annually. She is skilled in all phases of the project life cycle, from initial feasibility analysis and conceptual design through implementation, maintenance and support.

James Dinan, Intel

James Dinan

May 6, 2015 3:01 pm | Biographies

James Dinan is a computer scientist specializing in parallel and high performance computer systems, with a focus on parallel programming models, communication middleware, architecture, and scalable algorithms. His work seeks to enable better performance, new capabilities, and higher efficiency for scientific and engineering applications.

Wesley Bland, Argonne National Laboratory

Wesley Bland

May 6, 2015 2:56 pm | Biographies

Wesley Bland is a postdoctoral appointee at Argonne National Laboratory in the Programming Models and Runtime Systems group led by Dr. Pavan Balaji. He graduated from the University of Tennessee, Knoxville in 2013 under the advisement of Dr. Jack Dongarra. His research interests include fault tolerance, parallel and distributed programming models, and runtime systems.

Rosa M. Badia, Barcelona Supercomputing Center

Rosa M. Badia

May 6, 2015 2:49 pm | Biographies

Rosa M. Badia holds a Ph.D. from the UPC (1994). Before that, she graduated in Computer Science from the Facultat d'Informàtica de Barcelona (UPC, 1989). She lectured and did research at the Computer Architecture Department (DAC) at the UPC from 1989 to 2008, where she held an Associate Professor position from 1997 to 2008, and she is currently part-time lecturing again at the same department. She is a Scientific Researcher at the Spanish National Research Council (CSIC) and manager of the Grid computing and Clusters group at the Barcelona Supercomputing Center (BSC).

Sadaf Alam

May 6, 2015 2:41 pm | Biographies

Sadaf Alam is Associate Director and Chief Architect at the Swiss National Supercomputing Centre - ETHZ. Alam serves on the ISC'15 Tutorials Committee.

Frank Hannig

May 6, 2015 2:14 pm | Biographies

Frank Hannig is Head of the Architecture and Compiler Design Group in the Department of Computer Science at Friedrich-Alexander University in Erlangen-Nürnberg, Germany.

Alvaro Aguilera

May 6, 2015 1:59 pm | Biographies

Alvaro Aguilera is a Research Assistant at the Center for Information Services & High Performance Computing (ZIH), Technische Universität Dresden. Aguilera obtained his master's degree in computer science at the TU Dresden in 2011. His areas of interest include distributed file systems, storage solutions and performance analysis.

The new program builds on IBM Research advancements in analytics and existing Watson collaborations to develop a genome data analysis solution for clinicians. Partners involved in the program will use Watson Genomic Analytics.

14 Leading Cancer Institutes Collaborate to Advance Personalized Medicine for Cancer Patients

May 6, 2015 12:33 pm | by IBM | News | Comments

IBM Watson is collaborating with more than a dozen leading cancer institutes to accelerate clinicians’ ability to identify and personalize treatment options for their patients. The institutes will apply Watson's advanced cognitive capabilities to reduce from weeks to minutes the time needed to translate DNA insights, understand a person's genetic profile and gather relevant information from the medical literature to personalize treatment.

Scientists have programmed DNA to calculate multiple GPS routes at the same time. Courtesy of the American Chemical Society

Next Step in DNA Computing: GPS Mapping?

May 6, 2015 12:23 pm | by American Chemical Society | News | Comments

Conventional silicon-based computing, which has advanced by leaps and bounds in recent decades, is pushing against its practical limits. DNA computing could help take the digital era to the next level. Scientists are now reporting progress toward that goal with the development of a novel DNA-based GPS.

Michael Morris is General Manager at Appirio.

How Crowdsourcing can Solve Even Interstellar Problems

May 5, 2015 2:16 pm | by Michael Morris, Appirio | Blogs | Comments

Protecting the world from destruction by asteroids sounds like superhuman power, but NASA scientists work tirelessly to ensure that humans today are protected from this potential harm. Asteroids need to be hunted in order to identify which ones may endanger Earth, and analyzing the big data puzzle of asteroid detection has been an arduous process. That is, until the power of crowdsourcing was discovered.
