Brain-derived Computing beyond Von Neumann
Karlheinz Meier, professor of experimental physics at Heidelberg University’s Kirchhoff Institute for Physics, will deliver a keynote talk at the International Supercomputing Conference 2014 (ISC’14) on ‘Brain-derived computing beyond Von Neumann – achievements and challenges’. Meier is one of the co-directors of Europe’s Human Brain Project (HBP), where he leads a research group in neuromorphic computing. Funded by the European Commission, HBP is an ambitious 10-year, €1.19-billion project that aims to greatly advance our understanding of the human brain using cutting-edge computing technologies.
In light of his upcoming talk, Nages Sieslack speaks to Meier about the emerging field of neuromorphic computing and asks what he hopes to accomplish through HBP…
What is your specific involvement in HBP?
HBP builds on previous scientific work. I have previously initiated and led two large European projects that were very similar to HBP, although on a smaller scale: FACETS and BrainScaleS. Together with the Swiss Blue Brain project, they form the foundation of HBP.
In HBP I am one of the three executive directors. The work is shared between Henry Markram, Richard Frackowiak, and myself. Specifically, I am also in charge of the neuromorphic computing sub-project.
By training you are a particle physicist... how did you get into brain-inspired computing?
I spent more than 30 years of my life working on many experiments that built up the knowledge behind what we today call the standard model of particle physics. Particle physics relies on very elaborate data-processing systems, often constructed as custom hardware set-ups.
For the ATLAS experiment at CERN, I designed and built a large-scale, mixed-signal electronic data-processing system that analyzes information from 8,000 channels 40 million times per second. To achieve this, I founded a laboratory for microelectronics at Heidelberg University, Germany, back in 1994.
At some point, I learned about electronic models of brain cells and realized that it must be possible to build physical copies of these models in VLSI electronics. We started with small student projects and now the research has reached a very exciting scale. I decided to give up my particle physics research and dedicate myself to these new, brain-inspired computers. I strongly believe that there is large potential in this work for both fundamental research and applications.
What is neuromorphic computing and how is it different from conventional computing? How do you program such a chip?
Neuromorphic computers are systems with the same massive parallelism as the brain and the same functions on the microscopic and the macroscopic level. Communication between cells is carried out by stereotypic action potentials that propagate through the network asynchronously and in continuous time. Neuromorphic systems are very power-efficient, fault-tolerant, and compact. Most importantly, they can self-organize based on their input data.
In biology, this self-organization is called learning or development, depending on the timescales and mechanisms involved. Learning and development happen on timescales from milliseconds to years, but supercomputers used to simulate biological networks typically work 100 to 1,000 times more slowly. Neuromorphic systems are not programmed; instead, they use their ability to learn to reconfigure themselves based on input data.
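The spiking, event-driven communication Meier describes is often modelled in software with leaky integrate-and-fire neurons, a standard abstraction of the cells that neuromorphic hardware emulates in silicon. A minimal sketch (all parameter values here are arbitrary and chosen purely for illustration, not taken from HBP's actual models):

```python
# Illustrative leaky integrate-and-fire (LIF) neuron model. The membrane
# potential leaks toward a resting value, is driven up by input current,
# and emits a stereotypic spike whenever it crosses a fixed threshold.
# Parameter values are arbitrary, for demonstration only.

def simulate_lif(input_current, v_rest=0.0, v_thresh=1.0, tau=10.0, dt=1.0):
    """Integrate an input-current trace; return a 0/1 spike train."""
    v = v_rest
    spikes = []
    for i_t in input_current:
        # Leaky integration: decay toward rest, driven by the input.
        v += dt * (-(v - v_rest) + i_t) / tau
        if v >= v_thresh:
            spikes.append(1)  # stereotypic action potential
            v = v_rest        # reset after firing
        else:
            spikes.append(0)
    return spikes

# A constant supra-threshold input produces a regular spike train.
train = simulate_lif([2.0] * 100)
print(sum(train), "spikes in 100 steps")  # prints: 14 spikes in 100 steps
```

Real neuromorphic systems run millions of such units in parallel, communicating asynchronously by spikes rather than stepping through a shared program counter; this sequential loop only illustrates the single-neuron dynamics.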
What types of scientific disciplines need to be brought together to design and build a neuromorphic computer?
As neuromorphic computing is based on biological brains, a project like HBP relies on the coherent collaboration of neurobiologists, theoretical neuroscientists, mathematicians, physicists, and engineers. Making these groups speak the same language is the key challenge for projects like HBP.
Is there a preliminary model for such technology that your group will be using for the HBP work?
Yes, HBP will follow two complementary approaches for building neuromorphic computing systems. There are working prototypes for both approaches and the scaling concept is well defined. By the end of the 30-month ramp-up phase, HBP plans to operate a physical model system with 4 million neurons and 1 billion synapses in Heidelberg, and a system of 0.5 million ARM cores in Manchester, UK.
During the 10-year lifetime of the project, what do you expect to accomplish?
The goal of HBP is to build and operate six technology platforms that aggregate neuroscience data and use them for brain simulations on an exascale computer. From these simulations we will seek to derive very-large-scale neuromorphic computing systems. We do not make predictions about the scientific outcomes of the simulation experiments, but we promise to build collaborative tools that will enable very exciting science.
ISC'14 will be held in Leipzig from 22-26 June, 2014.
Nages Sieslack is PR manager for ISC events. This article originally appeared in iSGTW on February 12, 2014.