Publication: Recurrent competitive networks can learn locally excitatory topologies
| cris.lastimport.scopus | 2025-07-24T03:41:40Z | |
| dc.contributor.institution | Institute of Neuroinformatics | |
| dc.date.accessioned | 2013-02-28T07:22:15Z | |
| dc.date.available | 2013-02-28T07:22:15Z | |
| dc.date.issued | 2012-06-15 | |
| dc.description.abstract | A common form of neural network consists of spatially arranged neurons, with weighted connections between the units providing both local excitation and long-range or global inhibition. Such networks, known as soft-winner-take-all networks or lateral-inhibition type neural fields, have been shown to exhibit desirable information-processing properties including balancing the influence of compatible inputs, deciding between incompatible inputs, signal restoration from noisy, weak, or overly strong input, and the ability to be used as trainable building blocks in larger networks. However, the local excitatory connections in such a network are typically hand-wired based on a fixed spatial arrangement which is chosen using prior knowledge of the dimensionality of the data to be learned by such a network, and neuroanatomical evidence is stubbornly inconsistent with these wiring schemes. Here we present a learning rule that allows networks with completely random internal connectivity to learn the weighted connections necessary for implementing the “local” excitation used by these networks, where the locality is with respect to the inherent topology of the input received by the network, rather than being based on an arbitrarily prescribed spatial arrangement of the cells in the network. We use the Siegert approximation to leaky integrate-and-fire neurons, obtaining networks with consistently sparse activity, to which we apply standard Hebbian learning with weight normalization, plus homeostatic activity regulation to ensure full network utilization. Our results show that such networks learn appropriate excitatory connections from the input, and do not require these connections to be hand-wired with a fixed topology as they traditionally have been for decades. | |
| dc.identifier.doi | 10.1109/IJCNN.2012.6252786 | |
| dc.identifier.isbn | 978-1-4673-1489-3 | |
| dc.identifier.issn | 2161-4393 | |
| dc.identifier.scopus | 2-s2.0-84865073476 | |
| dc.identifier.uri | https://www.zora.uzh.ch/handle/20.500.14742/89877 | |
| dc.language.iso | eng | |
| dc.subject.ddc | 570 Life sciences; biology | |
| dc.title | Recurrent competitive networks can learn locally excitatory topologies | |
| dc.type | conference_item | |
| dcterms.accessRights | info:eu-repo/semantics/openAccess | |
| dcterms.bibliographicCitation.journaltitle | Proceedings of the International Joint Conference on Neural Networks | |
| dcterms.bibliographicCitation.originalpublishername | IEEE | |
| dcterms.bibliographicCitation.originalpublisherplace | Brisbane, Australia | |
| dcterms.bibliographicCitation.pageend | 8 | |
| dcterms.bibliographicCitation.pagestart | 1 | |
| dspace.entity.type | Publication | |
| oairecerif.event.country | Australia | |
| oairecerif.event.endDate | 2012-06-15 | |
| oairecerif.event.place | Brisbane | |
| oairecerif.event.startDate | 2012-06-10 | |
| uzh.contributor.affiliation | ETH Zürich | |
| uzh.contributor.affiliation | University of Zurich | |
| uzh.contributor.affiliation | ETH Zürich | |
| uzh.contributor.author | Jug, Florian | |
| uzh.contributor.author | Cook, Matthew | |
| uzh.contributor.author | Steger, Angelika | |
| uzh.contributor.correspondence | Yes | |
| uzh.contributor.correspondence | No | |
| uzh.contributor.correspondence | No | |
| uzh.document.availability | postprint | |
| uzh.eprint.datestamp | 2013-02-28 07:22:15 | |
| uzh.eprint.lastmod | 2022-01-24 00:22:17 | |
| uzh.eprint.statusChange | 2013-02-28 07:22:15 | |
| uzh.event.presentationType | speech | |
| uzh.event.title | IEEE International Joint Conference on Neural Networks (IJCNN) 2012 | |
| uzh.event.type | conference | |
| uzh.harvester.eth | Yes | |
| uzh.harvester.nb | No | |
| uzh.identifier.doi | 10.5167/uzh-75316 | |
| uzh.jdb.eprintsId | 31739 | |
| uzh.note.public | © 2012 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. | |
| uzh.oastatus.unpaywall | green | |
| uzh.oastatus.zora | Green | |
| uzh.publication.citation | Jug, Florian; Cook, Matthew; Steger, Angelika (2012). Recurrent competitive networks can learn locally excitatory topologies. In: IEEE International Joint Conference on Neural Networks (IJCNN) 2012, Brisbane, Australia, 10 June 2012 - 15 June 2012. IEEE, 1-8. | |
| uzh.publication.faculty | science | |
| uzh.publication.originalwork | original | |
| uzh.publication.pageNumber | 8 | |
| uzh.publication.publishedStatus | final | |
| uzh.publication.seriesTitle | Proceedings of the International Joint Conference on Neural Networks | |
| uzh.scopus.impact | 6 | |
| uzh.scopus.subjects | Software | |
| uzh.scopus.subjects | Artificial Intelligence | |
| uzh.workflow.doaj | uzh.workflow.doaj.false | |
| uzh.workflow.eprintid | 75316 | |
| uzh.workflow.fulltextStatus | public | |
| uzh.workflow.revisions | 39 | |
| uzh.workflow.rightsCheck | keininfo | |
| uzh.workflow.status | archive | |
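The learning scheme summarized in the abstract (standard Hebbian learning with weight normalization, plus homeostatic activity regulation, applied to initially random connectivity) can be sketched as follows. This is a minimal illustration under assumed parameter values, not the authors' implementation: a thresholded binary activation stands in for the Siegert rate approximation, and all sizes, rates, and the 1-D bump inputs are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N_IN, N_OUT = 20, 10          # input and network sizes (illustrative)
TARGET_RATE = 0.1             # desired mean activity per unit (assumed)
ETA_W, ETA_H = 0.05, 0.01     # Hebbian and homeostatic learning rates (assumed)

# Completely random initial excitatory weights, matching the paper's premise
# that no local topology is hand-wired in advance.
W = rng.random((N_OUT, N_IN))
W /= W.sum(axis=1, keepdims=True)      # weight normalization: each row sums to 1
thresh = np.full(N_OUT, 0.5)           # per-unit homeostatic thresholds

def step(x):
    """One input presentation: activity, Hebbian update, normalization, homeostasis."""
    global W, thresh
    a = (W @ x > thresh).astype(float)  # sparse binary activity (stand-in for
                                        # the Siegert rate approximation)
    W += ETA_W * np.outer(a, x)         # Hebbian: co-active pairs strengthen
    W /= W.sum(axis=1, keepdims=True)   # renormalize so no row grows unboundedly
    thresh += ETA_H * (a - TARGET_RATE) # homeostasis: overactive units become
                                        # harder to drive, idle units easier
    return a

# Inputs with an inherent 1-D topology: Gaussian bumps at random positions.
for _ in range(2000):
    c = rng.integers(N_IN)
    x = np.exp(-0.5 * ((np.arange(N_IN) - c) / 2.0) ** 2)
    step(x)

# After training, each unit's weights should concentrate on a contiguous
# patch of the input line, i.e. the "local" excitation is learned from the
# input's topology rather than prescribed spatially.
print(W.round(2))
```

The homeostatic term is what ensures full network utilization: units that never win have their thresholds lowered until they begin to respond, so every unit ends up claiming some region of the input topology.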