Publication:

A Theoretical Framework for Target Propagation

Date: 2020
Conference or Workshop Item
Published version
cris.lastimport.scopus: 2025-06-08T03:30:44Z
dc.contributor.institution: University of Zurich
dc.date.accessioned: 2021-02-03T10:30:49Z
dc.date.available: 2021-02-03T10:30:49Z
dc.date.issued: 2020-12-12
dc.description.abstract:

The success of deep learning, a brain-inspired form of AI, has sparked interest in understanding how the brain could similarly learn across multiple layers of neurons. However, the majority of biologically-plausible learning algorithms have not yet reached the performance of backpropagation (BP), nor are they built on strong theoretical foundations. Here, we analyze target propagation (TP), a popular but not yet fully understood alternative to BP, from the standpoint of mathematical optimization. Our theory shows that TP is closely related to Gauss-Newton optimization and thus substantially differs from BP. Furthermore, our analysis reveals a fundamental limitation of difference target propagation (DTP), a well-known variant of TP, in the realistic scenario of non-invertible neural networks. We provide a first solution to this problem through a novel reconstruction loss that improves feedback weight training, while simultaneously introducing architectural flexibility by allowing for direct feedback connections from the output to each hidden layer. Our theory is corroborated by experimental results that show significant improvements in performance and in the alignment of forward weight updates with loss gradients, compared to DTP.
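To make the idea in the abstract concrete, here is a minimal sketch of target propagation on a toy two-layer linear network. It uses an exact pseudo-inverse as the feedback pathway, an idealization of the learned feedback weights that DTP trains; all layer sizes, step sizes, and the quadratic local losses are illustrative assumptions, not the paper's actual setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-layer linear network x -> h1 -> h2 (sizes are illustrative).
n0, n1, n2 = 4, 3, 3
W1 = rng.standard_normal((n1, n0)) * 0.5
W2 = rng.standard_normal((n2, n1)) * 0.5
x = rng.standard_normal(n0)
y = rng.standard_normal(n2)

eta, lr = 0.5, 0.05  # target step size and local learning rate (assumed values)
losses = []
for _ in range(100):
    # Forward pass.
    h1 = W1 @ x
    h2 = W2 @ h1
    losses.append(0.5 * np.sum((h2 - y) ** 2))

    # Output target: a small step down the output-space loss gradient.
    t2 = h2 - eta * (h2 - y)
    # Propagate the target backwards through an (idealized) inverse of W2.
    # DTP instead learns feedback weights that approximate this inverse,
    # which is where the Gauss-Newton connection analyzed in the paper arises.
    t1 = np.linalg.pinv(W2) @ t2

    # Each layer descends its *local* loss ||h_i - t_i||^2 -- no global
    # backpropagated gradient is ever used.
    W2 -= lr * np.outer(h2 - t2, h1)
    W1 -= lr * np.outer(h1 - t1, x)

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

Note that the hidden-layer update direction involves the inverse of the forward weights rather than their transpose, which is exactly how TP departs from backpropagation.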

dc.identifier.scopus: 2-s2.0-85101288584
dc.identifier.uri: https://www.zora.uzh.ch/handle/20.500.14742/179560
dc.language.iso: eng
dc.subject.ddc: 570 Life sciences; biology
dc.title: A Theoretical Framework for Target Propagation

dc.type: conference_item
dcterms.accessRights: info:eu-repo/semantics/openAccess
dcterms.bibliographicCitation.originalpublishername: NeurIPS
dspace.entity.type: Publication
oairecerif.event.country: Canada / Virtual Conference
oairecerif.event.endDate: 2020-12-12
oairecerif.event.place: Vancouver
oairecerif.event.startDate: 2020-12-06
uzh.contributor.author: Meulemans, A
uzh.contributor.author: Carzaniga, F
uzh.contributor.author: Suykens, J
uzh.contributor.author: Sacramento, J
uzh.contributor.author: Grewe, B F
uzh.contributor.correspondence: Yes
uzh.contributor.correspondence: No
uzh.contributor.correspondence: No
uzh.contributor.correspondence: No
uzh.contributor.correspondence: No
uzh.document.availability: published_version
uzh.eprint.datestamp: 2021-02-03 10:30:49
uzh.eprint.lastmod: 2022-09-21 07:16:40
uzh.eprint.statusChange: 2021-02-03 10:30:49
uzh.event.presentationType: paper
uzh.event.title: 34th Conference on Neural Information Processing Systems (NeurIPS 2020)
uzh.event.type: conference
uzh.harvester.eth: Yes
uzh.harvester.nb: No
uzh.identifier.doi: 10.5167/uzh-198834
uzh.oastatus.zora: Green
uzh.publication.citation: Meulemans, A; Carzaniga, F; Suykens, J; Sacramento, J; Grewe, B F (2020). A Theoretical Framework for Target Propagation. In: 34th Conference on Neural Information Processing Systems (NeurIPS 2020), Vancouver, Canada / Virtual Conference, 6 December 2020 - 12 December 2020, NeurIPS.
uzh.publication.freeAccessAt: UNSPECIFIED
uzh.publication.originalwork: original
uzh.publication.publishedStatus: final
uzh.scopus.impact: 41
uzh.workflow.eprintid: 198834
uzh.workflow.fulltextStatus: public
uzh.workflow.revisions: 26
uzh.workflow.rightsCheck: keininfo
uzh.workflow.status: archive
Files

Original bundle

Name: NeurIPS-2020-a-theoretical-framework-for-target-propagation-Paper_(1).pdf
Size: 852.05 KB
Format: Adobe Portable Document Format