Publication:

Recurrent competitive networks can learn locally excitatory topologies

Date
2012
Conference or Workshop Item
Published version
cris.lastimport.scopus: 2025-07-24T03:41:40Z
dc.contributor.institution: Institute of Neuroinformatics
dc.date.accessioned: 2013-02-28T07:22:15Z
dc.date.available: 2013-02-28T07:22:15Z
dc.date.issued: 2012-06-15
dc.description.abstract

A common form of neural network consists of spatially arranged neurons, with weighted connections between the units providing both local excitation and long-range or global inhibition. Such networks, known as soft-winner-take-all networks or lateral-inhibition type neural fields, have been shown to exhibit desirable information-processing properties including balancing the influence of compatible inputs, deciding between incompatible inputs, signal restoration from noisy, weak, or overly strong input, and the ability to be used as trainable building blocks in larger networks. However, the local excitatory connections in such a network are typically hand-wired based on a fixed spatial arrangement which is chosen using prior knowledge of the dimensionality of the data to be learned by such a network, and neuroanatomical evidence is stubbornly inconsistent with these wiring schemes. Here we present a learning rule that allows networks with completely random internal connectivity to learn the weighted connections necessary for implementing the “local” excitation used by these networks, where the locality is with respect to the inherent topology of the input received by the network, rather than being based on an arbitrarily prescribed spatial arrangement of the cells in the network. We use the Siegert approximation to leaky integrate-and-fire neurons, obtaining networks with consistently sparse activity, to which we apply standard Hebbian learning with weight normalization, plus homeostatic activity regulation to ensure full network utilization. Our results show that such networks learn appropriate excitatory connections from the input, and do not require these connections to be hand-wired with a fixed topology as they traditionally have been for decades.

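The abstract combines three ingredients: Hebbian learning with weight normalization, homeostatic activity regulation, and soft winner-take-all competition. A minimal rate-based sketch of how these pieces fit together is below. This is not the authors' code: the network sizes, learning rates, and the ring-shaped input patterns are illustrative assumptions, and a simple rectified-linear drive stands in for the Siegert approximation to leaky integrate-and-fire neurons used in the paper.

```python
# Hypothetical sketch (not the paper's implementation): Hebbian learning
# with divisive weight normalization plus homeostatic threshold adaptation
# in a soft winner-take-all network with random initial connectivity.
import random

random.seed(0)

N_IN, N_OUT = 8, 4           # input and network sizes (illustrative)
ETA = 0.1                    # Hebbian learning rate (assumed)
TARGET_RATE = 1.0 / N_OUT    # homeostatic target activity per unit

# completely random initial connectivity, normalized so each unit's
# incoming weights sum to 1
W = [[random.random() for _ in range(N_IN)] for _ in range(N_OUT)]
W = [[w / sum(row) for w in row] for row in W]
theta = [0.0] * N_OUT        # adaptive thresholds (homeostasis)
avg_rate = [TARGET_RATE] * N_OUT

def soft_wta(x):
    """Soft WTA: rectified drive, then global divisive competition.

    A rate-based stand-in for the Siegert/LIF dynamics in the paper.
    """
    drive = [max(0.0, sum(w * xi for w, xi in zip(row, x)) - th)
             for row, th in zip(W, theta)]
    total = sum(drive) or 1.0
    return [d / total for d in drive]

def learn(x):
    y = soft_wta(x)
    for i in range(N_OUT):
        # Hebbian update, then divisive normalization to bound the weights
        row = [w + ETA * y[i] * xi for w, xi in zip(W[i], x)]
        s = sum(row)
        W[i] = [w / s for w in row]
        # homeostatic regulation: raise the threshold of over-active units
        # so that all units in the network get used
        avg_rate[i] = 0.9 * avg_rate[i] + 0.1 * y[i]
        theta[i] += 0.01 * (avg_rate[i] - TARGET_RATE)
    return y

# inputs with an inherent ring topology: neighbouring patterns overlap,
# so "local" excitation can be learned from input statistics alone
patterns = [[1.0 if j in (k, (k + 1) % N_IN) else 0.0 for j in range(N_IN)]
            for k in range(N_IN)]
for _ in range(200):
    learn(random.choice(patterns))
```

After training, each unit's weights remain normalized and non-negative, and the competition keeps activity sparse; in the paper this setup lets units specialize on overlapping regions of the input topology rather than on a prescribed spatial arrangement.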
dc.identifier.doi: 10.1109/IJCNN.2012.6252786
dc.identifier.isbn: 978-1-4673-1489-3
dc.identifier.issn: 2161-4393
dc.identifier.scopus: 2-s2.0-84865073476
dc.identifier.uri: https://www.zora.uzh.ch/handle/20.500.14742/89877
dc.language.iso: eng
dc.subject.ddc: 570 Life sciences; biology
dc.title

Recurrent competitive networks can learn locally excitatory topologies

dc.type: conference_item
dcterms.accessRights: info:eu-repo/semantics/openAccess
dcterms.bibliographicCitation.journaltitle: Proceedings of the International Joint Conference on Neural Networks
dcterms.bibliographicCitation.originalpublishername: IEEE
dcterms.bibliographicCitation.originalpublisherplace: Brisbane, Australia
dcterms.bibliographicCitation.pageend: 8
dcterms.bibliographicCitation.pagestart: 1
dspace.entity.type: Publication
oairecerif.event.country: Australia
oairecerif.event.endDate: 2012-06-15
oairecerif.event.place: Brisbane
oairecerif.event.startDate: 2012-06-10
uzh.contributor.affiliation: ETH Zürich
uzh.contributor.affiliation: University of Zurich
uzh.contributor.affiliation: ETH Zürich
uzh.contributor.author: Jug, Florian
uzh.contributor.author: Cook, Matthew
uzh.contributor.author: Steger, Angelika
uzh.contributor.correspondence: Yes
uzh.contributor.correspondence: No
uzh.contributor.correspondence: No
uzh.document.availability: postprint
uzh.eprint.datestamp: 2013-02-28 07:22:15
uzh.eprint.lastmod: 2022-01-24 00:22:17
uzh.eprint.statusChange: 2013-02-28 07:22:15
uzh.event.presentationType: speech
uzh.event.title: IEEE International Joint Conference on Neural Networks (IJCNN) 2012
uzh.event.type: conference
uzh.harvester.eth: Yes
uzh.harvester.nb: No
uzh.identifier.doi: 10.5167/uzh-75316
uzh.jdb.eprintsId: 31739
uzh.note.public: © 2012 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
uzh.oastatus.unpaywall: green
uzh.oastatus.zora: Green
uzh.publication.citation: Jug, Florian; Cook, Matthew; Steger, Angelika (2012). Recurrent competitive networks can learn locally excitatory topologies. In: IEEE International Joint Conference on Neural Networks (IJCNN) 2012, Brisbane, Australia, 10 June 2012 - 15 June 2012. IEEE, 1-8.
uzh.publication.faculty: science
uzh.publication.originalwork: original
uzh.publication.pageNumber: 8
uzh.publication.publishedStatus: final
uzh.publication.seriesTitle: Proceedings of the International Joint Conference on Neural Networks
uzh.scopus.impact: 6
uzh.scopus.subjects: Software
uzh.scopus.subjects: Artificial Intelligence
uzh.workflow.doaj: uzh.workflow.doaj.false
uzh.workflow.eprintid: 75316
uzh.workflow.fulltextStatus: public
uzh.workflow.revisions: 39
uzh.workflow.rightsCheck: keininfo
uzh.workflow.status: archive
Files

Original bundle

Name: 2012_TopoLearning.pdf
Size: 1.31 MB
Format: Adobe Portable Document Format