
Temporal Sequence Recognition in a Self-Organizing Recurrent Network


Ceolini, Enea; Neil, Daniel; Delbruck, Tobi; Liu, Shih-Chii (2016). Temporal Sequence Recognition in a Self-Organizing Recurrent Network. In: IEEE International Conference on Event-Based Control, Communication, and Signal Processing EBCCSP 2016, Krakow, Poland, 13 June 2016 - 15 June 2016.

Abstract

A major challenge for reservoir-based Recurrent Neural Networks (RNNs) is optimizing the connection weights within the network so that performance is optimal for the intended task of temporal sequence recognition. One particular RNN, the Self-Organizing Recurrent Network (SORN), avoids the mathematical normalization required after each initialization. Instead, three types of cortical plasticity mechanisms optimize the weights within the network during the initial part of training. The success of this unsupervised training method was demonstrated on temporal sequences whose input symbols use a binary encoding and activate only one input pool per time step. This work extends the analysis to different types of symbol encoding, including methods that activate multiple input pools and methods whose activation levels are not strictly binary but analog in nature. Preliminary results show that the SORN model classifies temporal sequences with these symbol encodings well, and that the advantage of this network over a static network in a classification task is retained.



Additional indexing

Item Type: Conference or Workshop Item (Speech), refereed, original work
Communities & Collections: 07 Faculty of Science > Institute of Neuroinformatics
Dewey Decimal Classification: 570 Life sciences; biology
Language: English
Event End Date: 15 June 2016
Deposited On: 27 Jan 2017 11:18
Last Modified: 29 Aug 2017 16:11
Publisher: Proceedings of 2016 Second International Conference on Event-based Control, Communication, and Signal Processing (EBCCSP)
Series Name: IEEE Second International Conference on Event-Based Control, Communication and Signal Processing (EBCCSP)
Publisher DOI: https://doi.org/10.1109/EBCCSP.2016.7605258
Official URL: http://ieeexplore.ieee.org/document/7605258/
