Event-based silicon retinas and cochleas


Delbruck, T; Liu, S-C (2012). Event-based silicon retinas and cochleas. In: Barth, F G; Humphrey, J A C; Srinivasan, M V. Frontiers in Sensing - From Biology to Engineering. Wien, Austria: Springer, 87-100.

Abstract

This chapter reviews neuromorphic silicon retinas and cochleas that are based on the structure and operation of their biological counterparts. These devices are built with conventional chip fabrication technologies, using transistor circuits that emulate neural computations from biology. In first-generation sensors, the analog outputs of every cell were read out serially at fixed sample rates. The new generation of sensors reports only the outputs of active cells through digital events (spikes) that are communicated asynchronously. Such sensors respond more quickly with reduced power consumption. Their digital “address-event” outputs rapidly convey precise timing information about the scene that is attained from conventional sensors only if they are continuously sampled at high rates. The sparseness, low latency, and spatio-temporal structure of this new form of sensor output data can benefit subsequent post-processing algorithms. Tradeoffs in the design of neuromorphic visual and auditory sensors are discussed. Examples are given of vision algorithms that process the address-events, using their spatio-temporal coherence, for low-level feature extraction and object tracking.
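
The sketch below illustrates, in Python, how the address-event output described in the abstract might be consumed by a simple event-driven tracker. It is not taken from the chapter: the event fields (x, y, timestamp, polarity) mirror the typical output of a dynamic vision sensor, and the exponentially weighted cluster update is one common, minimal way to exploit the spatio-temporal coherence of events; names and parameters are illustrative assumptions.

    # Minimal sketch (assumptions, not the authors' algorithm) of processing
    # address-events from an event-based vision sensor.

    from dataclasses import dataclass
    from typing import Iterable, Optional


    @dataclass
    class AddressEvent:
        x: int       # pixel column that generated the event
        y: int       # pixel row that generated the event
        t_us: int    # timestamp in microseconds
        on: bool     # polarity: True = brightness increase, False = decrease


    class EventCluster:
        """Tracks one moving object as an exponentially weighted mean of
        recent event addresses (a simple event-driven tracker)."""

        def __init__(self, x0: float, y0: float, mix: float = 0.05):
            self.x, self.y = x0, y0
            self.mix = mix  # how strongly each new event pulls the cluster

        def update(self, ev: AddressEvent) -> None:
            # Each event nudges the cluster center toward the event address;
            # no frames are involved, so latency is set by the event stream.
            self.x += self.mix * (ev.x - self.x)
            self.y += self.mix * (ev.y - self.y)


    def track(events: Iterable[AddressEvent], radius: float = 10.0) -> Optional[EventCluster]:
        """Feed a stream of events to a single cluster, ignoring events that
        fall too far from it (a crude use of spatial coherence)."""
        cluster = None
        for ev in events:
            if cluster is None:
                cluster = EventCluster(ev.x, ev.y)
            elif abs(ev.x - cluster.x) <= radius and abs(ev.y - cluster.y) <= radius:
                cluster.update(ev)
        return cluster


    if __name__ == "__main__":
        # Synthetic stream: an object drifting to the right generates ON events.
        stream = [AddressEvent(x=20 + i // 5, y=30, t_us=i * 100, on=True)
                  for i in range(500)]
        c = track(stream)
        print(f"estimated position: ({c.x:.1f}, {c.y:.1f})")

Because each event is processed as it arrives, the tracker's estimate is updated with microsecond-scale timestamps rather than at a fixed frame rate, which is the benefit of the address-event output highlighted in the abstract.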

Additional indexing

Item Type: Book Section, not refereed, original work
Communities & Collections: 07 Faculty of Science > Institute of Neuroinformatics
Dewey Decimal Classification: 570 Life sciences; biology
Language: English
Date: 2012
Deposited On: 06 Mar 2013 08:14
Last Modified: 05 Apr 2016 16:37
Publisher: Springer
ISBN: 978-3-211-99749-9
Publisher DOI: https://doi.org/10.1007/978-3-211-99749-9_6
Related URLs: http://www.springer.com/biomed/neuroscience/book/978-3-211-99748-2
http://opac.nebis.ch/F/?local_base=NEBIS&CON_LNG=GER&func=find-b&find_code=SYS&request=006716061

Download

Full text not available from this repository.
