
Structural plasticity denoises responses and improves learning speed


Spiess, Robin; George, Richard; Cook, Matthew; Diehl, Peter U (2016). Structural plasticity denoises responses and improves learning speed. Frontiers in Computational Neuroscience, 10:98.

Abstract

Despite an abundance of computational models for learning of synaptic weights, there has been relatively little research on structural plasticity, i.e., the creation and elimination of synapses. In particular, it is not clear how structural plasticity works in concert with spike-timing-dependent plasticity (STDP) and what advantages their combination offers. Here we present a fairly large-scale functional model that uses leaky integrate-and-fire neurons, STDP, homeostasis, recurrent connections, and structural plasticity to learn the input encoding, the relation between inputs, and to infer missing inputs. Using this model, we compare the error and the amount of noise in the network's responses with and without structural plasticity, and examine the influence of structural plasticity on the network's learning speed. Using structural plasticity during learning shows good results for learning the representation of input values, i.e., structural plasticity strongly reduces the noise of the response by preventing spikes with a high error. For inferring missing inputs we see similar results, with responses having less noise if the network was trained using structural plasticity. Additionally, using structural plasticity with pruning significantly decreased the time to learn weights suitable for inference. Presumably, this is due to the clearer signal containing fewer spikes that misrepresent the desired value. Therefore, this work shows that structural plasticity not only improves upon the performance of STDP alone but also speeds up learning. Additionally, it addresses the practical problem of limited resources for connectivity, which is apparent not only in the mammalian neocortex but also in conventional and neuromorphic (brain-inspired) computer hardware, by efficiently pruning synapses without losing performance.
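The mechanism the abstract describes — STDP shaping synaptic weights while structural plasticity prunes weak synapses and occasionally creates new ones — could be sketched roughly as follows. This is a minimal illustration, not the authors' model: the dictionary representation of connectivity and all threshold and probability values are assumptions made for the example.

```python
import random

def structural_step(weights, all_pairs, prune_below=0.05, grow_prob=0.01, init_w=0.1):
    """One structural-plasticity step: prune weak synapses, grow new ones.

    weights:   dict mapping (pre, post) -> synaptic weight for existing synapses
               (e.g., as left by a round of STDP updates).
    all_pairs: every possible (pre, post) connection in the network.
    All parameter values here are illustrative, not taken from the paper.
    """
    # Pruning: remove synapses whose weight has decayed below the threshold,
    # silencing connections that mostly contribute noisy, high-error spikes.
    for pair in [p for p, w in weights.items() if w < prune_below]:
        del weights[pair]
    # Growth: each currently absent synapse may be created with a small
    # probability, starting at a weak initial weight so STDP can then
    # strengthen it only if it proves useful.
    for pair in all_pairs:
        if pair not in weights and random.random() < grow_prob:
            weights[pair] = init_w
    return weights
```

Between such structural steps, STDP would continue to adjust the surviving weights; pruning keeps the total synapse count bounded, which is the resource argument the abstract makes for both biological and neuromorphic hardware.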

Statistics

Citations

Dimensions.ai Metrics
7 citations in Web of Science®
6 citations in Scopus®
2 citations in Microsoft Academic

Downloads

36 downloads since deposited on 26 Jan 2017
11 downloads in the past 12 months

Additional indexing

Item Type: Journal Article, refereed, original work
Communities & Collections: 07 Faculty of Science > Institute of Neuroinformatics
Dewey Decimal Classification: 570 Life sciences; biology
Scopus Subject Areas: Life Sciences > Neuroscience (miscellaneous)
  Life Sciences > Cellular and Molecular Neuroscience
Language: English
Date: 2016
Deposited On: 26 Jan 2017 12:46
Last Modified: 01 Apr 2020 22:18
Publisher: Frontiers Research Foundation
Series Name: Frontiers in Computational Neuroscience
ISSN: 1662-5188
OA Status: Gold
Free access at: Publisher DOI. An embargo period may apply.
Publisher DOI: https://doi.org/10.3389/fncom.2016.00093
Project Information:
  • Funder: FP7
  • Grant ID: 612058
  • Project Title: RAMP - Real neurons-nanoelectronics Architecture with Memristive Plasticity

Download

Gold Open Access

Download PDF: 'Structural plasticity denoises responses and improves learning speed'
Content: Published Version
Filetype: PDF
Size: 10MB
Licence: Creative Commons: Attribution 4.0 International (CC BY 4.0)