Local structure helps learning optimized automata in recurrent neural networks


Binas, J; Indiveri, G; Pfeiffer, M (2015). Local structure helps learning optimized automata in recurrent neural networks. In: The International Joint Conference on Neural Networks (IJCNN) 2015, Killarney, Ireland, 11 July 2015 - 17 July 2015, 1-7.

Abstract

Deterministic behavior can be modeled conveniently in the framework of finite automata. We present a recurrent neural network model based on biologically plausible circuit motifs that can learn deterministic transition models from given input sequences. Furthermore, we introduce simple, biologically inspired structural constraints on the connectivity. Simulation results show that these constraints yield substantial improvements in training time and more efficient use of resources in the converged system. Previous work has shown how specific instances of finite-state machines (FSMs) can be synthesized in recurrent neural networks by interconnecting multiple soft winner-take-all (SWTA) circuits, small circuits that can faithfully reproduce many computational properties of cortical networks. We extend this framework with a reinforcement learning mechanism that learns correct state transitions as input and reward signals are provided. The network not only learns a model of the observed sequences and encodes it in its recurrent synaptic weights, but also finds solutions that are close to optimal in the number of states required to model the target system, leading to efficient scaling behavior as the size of the target problems increases.
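The abstract only summarizes the approach; as a loose, hypothetical illustration of the reward-driven transition learning it describes, the sketch below learns the transition table of a small deterministic automaton from a scalar reward signal using a tabular preference rule. This is an assumption-laden toy, not the paper's method: the paper uses interconnected SWTA circuits with learned recurrent synaptic weights, whereas here the "network" is just a table of preference weights, and the target automaton is an arbitrary example.

```python
import random

random.seed(0)

N_STATES, N_SYMBOLS = 3, 2

# Hidden target automaton (arbitrary example): (state, symbol) -> next state.
target = {(s, a): (s + a + 1) % N_STATES
          for s in range(N_STATES) for a in range(N_SYMBOLS)}

# Learner: preference weights over candidate next states for each
# (state, symbol) pair. All start at zero (no transition preferred).
weights = {(s, a): [0.0] * N_STATES
           for s in range(N_STATES) for a in range(N_SYMBOLS)}

def predict(s, a):
    """Greedy choice of next state; ties broken at random."""
    w = weights[(s, a)]
    best = max(w)
    return random.choice([i for i, v in enumerate(w) if v == best])

# Reward-driven learning: reinforce transitions that match the
# observed (rewarded) behavior, weaken those that do not.
for _ in range(2000):
    s = random.randrange(N_STATES)
    a = random.randrange(N_SYMBOLS)
    guess = predict(s, a)
    reward = 1.0 if guess == target[(s, a)] else -0.1
    weights[(s, a)][guess] += reward

learned = {k: predict(*k) for k in target}
accuracy = sum(learned[k] == target[k] for k in target) / len(target)
print(f"transition accuracy: {accuracy:.2f}")
```

After enough rewarded trials the correct next state dominates every preference vector, so the greedy read-out reproduces the target transition table exactly; the paper's contribution is achieving the analogous effect in a biologically plausible SWTA network while also minimizing the number of states used.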



Additional indexing

Item Type: Conference or Workshop Item (Speech), not refereed, original work
Communities & Collections: 07 Faculty of Science > Institute of Neuroinformatics
Dewey Decimal Classification: 570 Life sciences; biology
Language: English
Event End Date: 17 July 2015
Deposited On: 19 Feb 2016 12:37
Last Modified: 25 Jul 2018 04:39
Publisher: IEEE Xplore
OA Status: Green
Free access at: Publisher DOI. An embargo period may apply.
Publisher DOI: https://doi.org/10.1109/IJCNN.2015.7280714
Related URLs: https://www.recherche-portal.ch/ZAD:default_scope:ZORA121690 (Library Catalogue)

Download

Accepted Version (PDF, 258 kB): 'Local structure helps learning optimized automata in recurrent neural networks'. Also available at the publisher via the DOI above.