Balancing pencils using spike-based vision sensors


Conradt, J; Berner, R; Lichtsteiner, P; Douglas, R J; Delbruck, T; Cook, M (2009). Balancing pencils using spike-based vision sensors. In: Bernstein Conference on Computational Neuroscience 2009 (BCCN 2009), Frankfurt am Main, DE, 30 September 2009 - 2 October 2009, online.

Abstract

Animals far outperform current technology when reacting to visual stimuli, achieving astonishingly fast reaction times with low processing requirements. Current real-time vision-based robotic control approaches, in contrast, typically require substantial computational resources to extract relevant information from the sequences of images provided by a video camera. Most of the information contained in consecutive images is redundant, which often makes the vision processing algorithms the limiting factor in high-speed robot control. For example, robotic pole balancing with large objects is a well-known exercise in current robotics research, but balancing arbitrarily small poles (such as a pencil, which is too small for a human to balance) has not yet been achieved, owing to limitations in vision processing.

At the Institute of Neuroinformatics we developed an analog silicon retina (http://siliconretina.ini.uzh.ch) which, in contrast to current video cameras, only reports individual events ("spikes") from individual pixels when the illumination changes within the pixel's field of view. Transmitting only the "on" and "off" spike events, instead of transmitting full vision frames, drastically reduces the amount of data processing required to react to environmental changes. This information encoding is directly inspired by the spike-based information transfer from the human eye to the visual cortex.
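
The abstract does not specify an event format, but as a minimal Python sketch of the idea, each event can be modelled as a pixel address, a timestamp, and an on/off polarity; the field names below are illustrative assumptions, while the real sensor streams its events in the Address-Event Representation (AER).

```python
from dataclasses import dataclass
from typing import Iterable, Tuple

@dataclass
class DVSEvent:
    # Field names are illustrative; the actual hardware streams events
    # as Address-Event Representation (AER) packets.
    x: int             # pixel column
    y: int             # pixel row
    timestamp_us: int  # sensor-assigned timestamp in microseconds
    on: bool           # True: illumination increased; False: decreased

def tally_polarities(events: Iterable[DVSEvent]) -> Tuple[int, int]:
    """Toy consumer: tally "on" and "off" events.

    Work is done only when a pixel actually reports a change, so a
    static scene generates no events and costs no processing at all.
    """
    on_count = off_count = 0
    for ev in events:
        if ev.on:
            on_count += 1
        else:
            off_count += 1
    return on_count, off_count
```

This is where the data reduction comes from: a conventional camera would deliver every pixel of every frame, whereas here only the handful of pixels crossed by the moving pencil produce any data at all.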

In our demonstration, we address the challenging problem of balancing an arbitrary standard pencil based solely on visual information. A stereo pair of silicon retinas reports vision events caused by the moving pencil, which stands on its tip on an actuated table. Our processing algorithm then extracts the pencil's position and angle without ever constructing a "full scene" visual representation, simply by processing only the spikes relevant to the pencil's motion.
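
The abstract leaves the extraction algorithm unspecified; purely as an illustrative sketch of event-driven state estimation (not the authors' published method), the class below fits a line to the event stream by maintaining exponentially decaying coordinate moments, from which a position and angle follow in constant time per event.

```python
import math

class EventLineTracker:
    """Hypothetical event-driven line estimator.

    Keeps exponentially decaying first- and second-order moments of
    recent event coordinates. The weighted mean gives the pencil's
    position; the principal axis of the covariance gives its angle.
    Every event is an O(1) update, and no image frame is ever stored.
    """

    def __init__(self, decay: float = 0.995):
        self.decay = decay           # per-event forgetting factor
        self.w = 0.0                 # total decayed event weight
        self.sx = self.sy = 0.0      # weighted coordinate sums
        self.sxx = self.sxy = self.syy = 0.0  # weighted second moments

    def update(self, x: float, y: float) -> None:
        """Fold one retina event at pixel (x, y) into the estimate."""
        d = self.decay
        self.w = d * self.w + 1.0
        self.sx = d * self.sx + x
        self.sy = d * self.sy + y
        self.sxx = d * self.sxx + x * x
        self.sxy = d * self.sxy + x * y
        self.syy = d * self.syy + y * y

    def estimate(self) -> "tuple[float, float, float]":
        """Return (x, y, angle): the line's centroid and its
        orientation in radians from the horizontal axis."""
        if self.w == 0.0:
            raise ValueError("no events received yet")
        mx, my = self.sx / self.w, self.sy / self.w
        cxx = self.sxx / self.w - mx * mx
        cxy = self.sxy / self.w - mx * my
        cyy = self.syy / self.w - my * my
        # Angle of the principal eigenvector of the 2x2 covariance.
        angle = 0.5 * math.atan2(2.0 * cxy, cxx - cyy)
        return mx, my, angle
```

With a stereo pair, one such tracker per retina would run in parallel and the two line estimates would be triangulated into the pencil's 3D pose for the table controller; the sketch above deliberately covers only a single sensor's 2D view.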

Our system uses neurally inspired hardware and a neurally inspired form of communication to achieve a difficult goal.

Additional indexing

Item Type: Conference or Workshop Item (Speech), refereed, original work
Communities & Collections: 07 Faculty of Science > Institute of Neuroinformatics
Dewey Decimal Classification: 570 Life sciences; biology
Language: English
Event End Date: 2 October 2009
Deposited On: 13 Mar 2010 20:39
Last Modified: 05 Apr 2016 13:59
Related URLs: http://bccn2009.org/ (Organisation), http://www.ini.uzh.ch/node/24027
Permanent URL: http://doi.org/10.5167/uzh-31958

Download

Filetype: PDF
Size: 1MB
