Spiking Neural Networks (SNNs) represent the third generation of artificial neural networks. SNN models exploit both the spatial and temporal structure of their input data, and as such advance a step closer to true brain-inspired processing. SNNs have shown great promise for low-power sensory processing and edge computing hardware platforms.
In the paper An Error-Propagation Spiking Neural Network Compatible With Neuromorphic Processors, researchers from ETH Zurich leverage existing spike-based learning circuits to propose a biologically plausible architecture that is highly successful in classifying distinct complex spatio-temporal spike patterns.
The study advances the design and development of ultra-low-power mixed-signal neuromorphic processing systems capable of distinguishing spatio-temporal patterns in spiking activity produced, for example, by vision or auditory sensors. The researchers say the network hardware’s low power usage (the neurons only transmit information when their input reaches a spiking threshold) makes SNNs good candidates for bio-signal processing and brain-machine interfaces.
The researchers first describe the network topology of a cortical model. Inspired by the functionality, connectivity, and diversity of cell types in the human neocortex, the cortical model design comprises both rate-based multicompartment excitatory neurons and inhibitory interneurons.
The multicompartment neuron is schematized as a three-compartment (apical, basal, and somatic) pyramidal neuron (P). This pyramidal neuron integrates bottom-up sensory information and a top-down teaching signal, with its apical compartment receiving feedback from higher-order areas as well as from the interneurons.
During learning, the interneurons, laterally driven by the pyramidal neurons, learn to replicate the spiking activity of specific neurons in the layer above, for instance cancelling the top-down teacher signal arriving at the apical compartment of the pyramidal neuron. Similarly, the firing rate of the readout neuron (R) tends toward the teacher signal coming from the layer above, shaped by the lateral connectivity from the interneurons.
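The dendritic prediction idea described above can be sketched in a few lines of rate-based code. This is a toy illustration, not the paper's model: all weight matrices, conductance values, and the `tanh` rate nonlinearity are hypothetical placeholders chosen only to show how the basal (bottom-up) and apical (top-down minus interneuron inhibition) compartments combine at the soma.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 4 sensory inputs, 3 pyramidal neurons (and 3 interneurons).
n_in, n_pyr = 4, 3

W_basal = rng.normal(0.0, 0.5, (n_pyr, n_in))   # bottom-up sensory weights
W_top   = rng.normal(0.0, 0.5, (n_pyr, n_pyr))  # top-down teacher feedback
W_inter = rng.normal(0.0, 0.5, (n_pyr, n_pyr))  # lateral inhibition from interneurons

def pyramidal_rate(x, teacher, inter_rate):
    """Somatic firing rate of a toy three-compartment pyramidal neuron.

    basal:  integrates bottom-up input x
    apical: top-down teacher feedback minus interneuron inhibition
    soma:   conductance-weighted mix of the two dendritic compartments
    """
    v_basal  = W_basal @ x
    v_apical = W_top @ teacher - W_inter @ inter_rate
    g_b, g_a, g_leak = 1.0, 0.8, 1.0  # hypothetical coupling/leak conductances
    v_soma = (g_b * v_basal + g_a * v_apical) / (g_b + g_a + g_leak)
    return np.tanh(v_soma)            # saturating rate nonlinearity

x = rng.random(n_in)
teacher = np.zeros(n_pyr)   # no teaching signal in this example
inter = np.zeros(n_pyr)     # interneurons silent
rates = pyramidal_rate(x, teacher, inter)
print(rates)
```

When the interneurons have learned to match the top-down feedback, the apical term cancels and the soma is driven by the bottom-up pathway alone, which is the self-predicting regime the architecture relies on.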
The researchers then adapt the model so it can be directly implemented with sub-threshold neuromorphic circuits, reformulating its dynamics and plasticity mechanisms accordingly. They design a learning rule for the bottom-up connections in which incremental weight updates are driven by the difference between somatic and basal membrane currents. The team explains that a hysteresis term in the update equation gives rise to a stop-learning region, which enables lifelong learning.
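A minimal sketch of such a rule, under stated assumptions: the exact equation, threshold value, and learning rate below are illustrative, not taken from the paper. The update is purely local (post-synaptic current mismatch times pre-synaptic activity), and a dead zone around zero error stands in for the hysteresis-induced stop-learning region.

```python
import numpy as np

def weight_update(i_soma, i_basal, x_pre, lr=0.01, theta=0.05):
    """Toy bottom-up plasticity rule (illustrative, not the paper's equation).

    The weight change is driven by the mismatch between somatic and basal
    membrane currents; the hysteresis threshold `theta` creates a
    stop-learning region where small mismatches trigger no update, so
    consolidated weights are protected during lifelong learning.
    """
    err = i_soma - i_basal
    err = np.where(np.abs(err) < theta, 0.0, err)  # stop-learning dead zone
    return lr * np.outer(err, x_pre)               # local outer-product update

# Example: a small current mismatch falls inside the dead zone -> no change.
x = np.array([1.0, 0.0, 1.0])
dw_small = weight_update(np.array([0.52]), np.array([0.50]), x)
dw_large = weight_update(np.array([0.90]), np.array([0.50]), x)
print(dw_small)
print(dw_large)
```

Because the rule depends only on quantities available at the synapse itself, it maps naturally onto per-synapse analog learning circuits, which is the compatibility property the paper emphasizes.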
Following the rate-based model proposed by J. Sacramento et al., the team evaluated the proposed spiking network on spatio-temporal pattern recognition and discrimination tasks.
In the pattern recognition task, the difference in activity between a known Pattern 1 and a deviating sensory input (Pattern 2) showed that the proposed method’s learning is pattern-specific. In the pattern discrimination task, the output neurons’ spiking activity correctly classified the input patterns.
The proposed architecture relies solely on weight updates triggered by local variables and parameters, and is thus suitable for implementation on mixed-signal analog/digital neuromorphic chips. The study advances the development of low-power, always-on learning chips that can be applied in edge computing, robotics, and distributed computation.
The paper An Error-Propagation Spiking Neural Network Compatible With Neuromorphic Processors is on arXiv.
Author: Hecate He | Editor: Michael Sarazen