A model of how the brain may learn new things has been successfully created on a computer system that was itself designed on brain-inspired principles.
The model was developed by an international team of computer scientists and made use of the SpiNNaker system, developed by Professor Steve Furber at The University of Manchester. The research examined the trade-offs in efficiency and accuracy involved in developing an event-based Bayesian Confidence Propagation Neural Network (BCPNN) learning rule for SpiNNaker. This rule was then used in a recurrent attractor network model to learn temporal sequences of neural activity; the model was simulated at scales of up to 2.0 × 10⁴ neurons and 5.1 × 10⁷ plastic synapses, the largest plastic neural network ever simulated on neuromorphic hardware.
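To give a flavour of the learning rule involved: the classic BCPNN weight between two units is the log-ratio of their joint activation probability to the product of their marginal probabilities, so units that fire together more often than chance acquire positive weights. The sketch below is a simplified batch estimate of that rule in Python, not the event-based, trace-driven implementation the team developed for SpiNNaker; the function name and the small `eps` regulariser are illustrative assumptions.

```python
import numpy as np

def bcpnn_weights(pre, post, eps=1e-4):
    """Batch estimate of BCPNN weights from binary activity rasters.

    pre:  (T, N_pre) array of 0/1 activity over T time steps
    post: (T, N_post) array of 0/1 activity over the same steps
    Returns an (N_pre, N_post) matrix w_ij = log(p_ij / (p_i * p_j)),
    where p_i, p_j are marginal firing probabilities and p_ij is the
    joint co-activation probability. eps avoids log(0) for silent units.
    """
    T = pre.shape[0]
    p_i = pre.mean(axis=0) + eps          # marginal pre-synaptic probabilities
    p_j = post.mean(axis=0) + eps         # marginal post-synaptic probabilities
    p_ij = (pre.T @ post) / T + eps       # joint co-activation probabilities
    return np.log(p_ij / np.outer(p_i, p_j))
```

Units that co-activate above chance get positive (excitatory-like) weights and anti-correlated units get negative ones; the event-based version on SpiNNaker replaces these batch averages with exponentially decaying traces updated only when spikes occur, which is what makes it efficient on neuromorphic hardware.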
A comparable simulation was run on a Cray XC-30 supercomputer system; to match the run-time of the SpiNNaker simulation, the supercomputer needed approximately 45 times more power. This suggests that cheaper, more power-efficient neuromorphic systems are becoming useful discovery tools in the study of plasticity in large-scale brain models. It is anticipated that the group's latest research will provide a valuable resource for future developments in the fields of neuroscience, robotics and computer science.
- Advanced Processor Technologies Group, School of Computer Science, University of Manchester
- Department of Computational Biology, Royal Institute of Technology, Stockholm, Sweden
- Stockholm Brain Institute, Karolinska Institute, Stockholm, Sweden
- Institute for Adaptive and Neural Computation, School of Informatics, University of Edinburgh
- Department of Visualization and Data Analysis, Zuse Institute Berlin, Germany
- Department of Numerical Analysis and Computer Science, Stockholm University, Sweden