Everyone has heard the old adage that time is money. In today’s society, business moves at the speed of making a phone call, looking something up online via your cell phone, or posting a tweet. So, when time is money (and can be a lot of money), why are businesses okay with waiting weeks or even months to get valuable information from their data?
SC15’s Visualization and Data Analytics Showcase Program will provide a forum for the year's...
Gamers might one day be able to enjoy the same graphics-intensive fast-action video games they...
Engineers have taken a step forward in creating the next generation of computers and mobile...
Moore’s Law recently turned 50 years old, and many have used the milestone to tout its virtues, highlight the positive results that stem from it, suggest what its future dividends will be, and boldly project the date of its inevitable demise. Moore’s Law is an observation that has undoubtedly inspired us to innovate at the pace it predicts; it has challenged us to do so. That is why I think of it as Moore’s drumbeat.
Two experiments at the Large Hadron Collider at the European Organization for Nuclear Research (CERN) in Geneva, Switzerland, have combined their results and observed a previously unseen subatomic process. A joint analysis by the CMS and LHCb collaborations has established a new and extremely rare decay of the Bs particle (a heavy composite particle consisting of a bottom antiquark and a strange quark) into two muons.
In the new era of quantum computers, many daily life applications, such as home banking, are doomed to failure, and new forms of ensuring the confidentiality of our data are being studied to overcome this threat. Researchers have taken a step in this direction by proposing a quantum blind signature scheme, which ensures that signatures cannot be copied and that the sender must commit to a single message.
Extracting meaningful information out of clinical datasets can mean the difference between a successful diagnosis and a protracted illness. However, datasets can vary widely both in terms of the number of ‘features’ measured and number of independent observations taken. Now, researchers have developed an approach for targeted feature selection from datasets with small sample sizes, which tackles the so-called class imbalance problem.
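The class imbalance problem mentioned above arises when one class (say, healthy patients) vastly outnumbers another (diseased patients), so a naive model can score well by always predicting the majority class. A common generic remedy, shown here as a minimal illustration and not as the researchers' actual method, is to weight each class inversely to its frequency, w_c = N / (K · n_c), so the rare class counts as much as the common one during training:

```python
from collections import Counter

def balanced_class_weights(labels):
    """Weight each class inversely to its frequency: w_c = N / (K * n_c)."""
    counts = Counter(labels)
    n, k = len(labels), len(counts)
    return {c: n / (k * nc) for c, nc in counts.items()}

# A toy clinical-style dataset: 9 "healthy" samples, 1 "diseased" sample.
labels = ["healthy"] * 9 + ["diseased"]
weights = balanced_class_weights(labels)
# With N=10 and K=2: diseased -> 10/(2*1) = 5.0, healthy -> 10/(2*9) ~= 0.56,
# so the single rare sample carries nine times the weight of a common one.
```

The same weighting scheme is what many off-the-shelf classifiers apply when asked for "balanced" class weights; it addresses the imbalance without discarding any of the scarce observations, which matters when sample sizes are small.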
A new initiative designed to advance how scientists digitally reconstruct and analyze individual neurons in the human brain will receive support from the supercomputing resources at the Department of Energy’s Oak Ridge National Laboratory (ORNL). Led by the Allen Institute for Brain Science, the BigNeuron project aims to create a common platform for analyzing the three-dimensional structure of neurons.
The ISC Cloud & Big Data Research Committee is accepting submissions until Tuesday, May 19, 2015. The Research Paper Sessions “aim to provide first-class open forums for engineers and scientists in academia, industry and government to present and discuss issues, trends and results to shape the future of cloud computing and big data.” The sessions will be held on Tuesday, September 29 and on Wednesday, September 30, 2015.
Each year, the global supercomputing community honors a handful of the leading contributors to the field with the presentation of the IEEE Seymour Cray Computer Science and Engineering Award, the IEEE Sidney Fernbach Memorial Award and the ACM-IEEE Ken Kennedy Award. Nominations for these awards to be presented at SC15 in Austin are now open and the submission deadline is Wednesday, July 1, 2015.
The German Climate Computing Center is managing the world's largest climate simulation data archive, used by climate researchers worldwide. The archive consists of more than 40 petabytes of data and is projected to grow by roughly 75 petabytes annually over the next five years. As climate simulations are carried out on increasingly powerful supercomputers, massive amounts of data are produced that must be effectively stored and analyzed.
Naoya Maruyama is a Team Leader at RIKEN Advanced Institute for Computational Science (RIKEN AICS), where he leads the HPC Programming Framework Research Team. He joined RIKEN AICS in 2012 after years at Tokyo Institute of Technology, where he received his Ph.D. in Computer Science in 2008.
As a Senior Technical Consultant at NAG, Craig Lucas's key area is high performance computing. He also works on NAG's Numerical Libraries, with an emphasis on multi-core parallelism and numerical linear algebra. Craig joined NAG in 2008 to work on the HECToR service and is based in NAG's Manchester office. Before this he worked on CSAR, a previous national supercomputing service. He has also contributed software to the LAPACK library.
AMD provided details of the company’s multi-year strategy to drive profitable growth based on delivering next-generation technologies powering a broad set of high-performance, differentiated products. Technology-related announcements included the development of a brand-new x86 processor core, codenamed “Zen,” which will feature simultaneous multi-threading (SMT) for higher throughput and a new cache subsystem.
I-hsin Chung is a Research Staff Member at the Thomas J. Watson Research Center, IBM Research. He holds a Ph.D. in Computer Science from the University of Maryland, College Park.
François Bodin co-founded CAPS entreprise in 2002 while he was a Professor at the University of Rennes I, and has served as the company's CTO since January 2008. His contributions include new approaches for exploiting high performance processors in scientific computing and in embedded applications. Prior to joining CAPS, François Bodin held various research positions at the University of Rennes I and at the IRISA/INRIA research lab.
Dr. Pekka Lehtovuori holds a Ph.D. in Physical Chemistry. Currently he is working as a Director, Services for Research at CSC - the Finnish IT center for science. He was previously responsible for the coordination and operation of CSC's international and national grid-infrastructures (e.g. DEISA, EGEE, NDGF, and the Finnish national computing grid, M-grid).
Sonia Sachs is an expert in the design and delivery of cost-effective, high-performance technology solutions in support of large and complex scientific and commercial projects and programs, with budget responsibilities of up to $20 million annually. She is skilled in all phases of the project life cycle, from initial feasibility analysis and conceptual design through implementation, maintenance and support.
James Dinan is a computer scientist specializing in parallel and high performance computer systems, with a focus on parallel programming models, communication middleware, architecture, and scalable algorithms. His work seeks to enable better performance, new capabilities, and higher efficiency for scientific and engineering applications.
Wesley Bland is a postdoctoral appointee at Argonne National Laboratory in the Programming Models and Runtime Systems group led by Dr. Pavan Balaji. He graduated from the University of Tennessee, Knoxville in 2013 under the advisement of Dr. Jack Dongarra. His research interests include fault tolerance, parallel and distributed programming models, and runtime systems.
Rosa M. Badia holds a PhD from the UPC (1994). Before that, she graduated in Computer Science from the Facultat d'Informàtica de Barcelona (UPC, 1989). She lectured and did research at the Computer Architecture Department (DAC) at the UPC from 1989 to 2008, where she held an Associate Professor position from 1997 to 2008, and she is currently part-time lecturing again at the same department. She is a Scientific Researcher at the Spanish National Research Council (CSIC) and the manager of the Grid Computing and Clusters group at the Barcelona Supercomputing Center (BSC).
Sadaf Alam is Associate Director and Chief Architect at the Swiss National Supercomputing Centre - ETHZ. She serves on the ISC'15 Tutorials Committee.
Frank Hannig is Head of the Architecture and Compiler Design Group in the Department of Computer Science at Friedrich-Alexander University in Erlangen-Nürnberg, Germany.
Alvaro Aguilera is a Research Assistant at the Center for Information Services & High Performance Computing (ZIH), Technische Universität Dresden. Aguilera obtained his master's degree in computer science at the TU Dresden in 2011. His areas of interest include distributed file systems, storage solutions and performance analysis.
IBM Watson is collaborating with more than a dozen leading cancer institutes to accelerate the ability of clinicians to identify and personalize treatment options for their patients. The institutes will apply Watson's advanced cognitive capabilities to reduce from weeks to minutes the time it takes to translate DNA insights, understand a person's genetic profile, and gather relevant information from the medical literature to personalize treatment.
Conventional silicon-based computing, which has advanced by leaps and bounds in recent decades, is pushing against its practical limits. DNA computing could help take the digital era to the next level. Scientists are now reporting progress toward that goal with the development of a novel DNA-based GPS.
Protecting the world from destruction by asteroids sounds like superhuman power, but NASA scientists work tirelessly to ensure that humans today are protected from this potential harm. Asteroids need to be hunted in order to identify which ones may endanger Earth, and analyzing the big data puzzle of asteroid detection has been an arduous process. That is, until the power of crowdsourcing was discovered.