
Multichannel audio biofeedback for dynamical coupling between prosthetic hands and their users


Gonzalez, J; Hernandez Arieta, A; Yu, W (2010). Multichannel audio biofeedback for dynamical coupling between prosthetic hands and their users. Industrial Robot: An International Journal, 37(2):148-156.

Abstract

It is widely agreed that amputees have to rely on visual input to monitor and control the position of a prosthesis while reaching and grasping, because of the lack of proprioceptive feedback. Visual information has therefore been a prerequisite in prosthetic hand biofeedback studies, and as a result the underlying characteristics of other artificial feedback methods used to date, such as auditory, electro-tactile, or vibro-tactile feedback, have not been clearly explored. The purpose of this paper is to explore whether audio feedback alone can convey more than one independent variable (multichannel) simultaneously, without relying on vision, to improve the learning of new perceptions; in this case, to learn and understand the artificial proprioception of a prosthetic hand while reaching.
Experiments are conducted to determine whether audio signals can be used as a multi-variable dynamical sensory substitution in reaching movements without relying on visual input. Two groups are tested: the first uses only audio information and the second only visual information to convey computer-simulated trajectories of two fingers.
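The paper evaluates this idea experimentally; the sketch below is only a rough illustration of the general technique, not the authors' mapping. It assumes two simulated, normalised finger trajectories and sonifies each one by modulating the pitch of a sine tone on a separate stereo channel, so that two independent variables are conveyed simultaneously through audio alone.

# Illustrative sketch (an assumption, not the authors' implementation):
# map two simulated finger trajectories to the pitch of two sine tones,
# one per stereo channel, so each channel sonifies one variable.
import numpy as np
from scipy.io import wavfile

FS = 44100                      # audio sample rate (Hz)
DURATION = 3.0                  # seconds of simulated reaching movement
t = np.linspace(0.0, DURATION, int(FS * DURATION), endpoint=False)

# Hypothetical finger trajectories, normalised to [0, 1]
# (e.g., index and thumb joint angles during a reach-and-grasp).
finger1 = 0.5 * (1.0 - np.cos(2.0 * np.pi * t / DURATION))   # slow open-close
finger2 = np.clip(t / DURATION, 0.0, 1.0)                    # steady flexion

def sonify(trajectory, f_min=220.0, f_max=880.0):
    """Map a normalised trajectory to a frequency-modulated sine tone."""
    freq = f_min + trajectory * (f_max - f_min)
    phase = 2.0 * np.pi * np.cumsum(freq) / FS   # integrate frequency to phase
    return 0.3 * np.sin(phase)

# One variable per channel: left ear tracks finger 1, right ear finger 2.
stereo = np.stack([sonify(finger1), sonify(finger2)], axis=1)
wavfile.write("audio_feedback.wav", FS, stereo.astype(np.float32))

Listening to the resulting file, rising pitch in either ear corresponds to flexion of the corresponding simulated finger, which is one simple way two independent variables can be tracked by ear without visual input.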


Downloads

67 downloads since deposited on 17 Jan 2011
12 downloads in the last 12 months

Additional indexing

Item Type: Journal Article, refereed, original work
Communities & Collections: 03 Faculty of Economics > Department of Informatics
Dewey Decimal Classification: 000 Computer science, knowledge & systems
Language: English
Date: 2010
Deposited On: 17 Jan 2011 10:22
Last Modified: 05 Apr 2016 14:35
Publisher: Emerald
ISSN: 0143-991X
Publisher DOI: https://doi.org/10.1108/01439911011018920
Other Identification Number: 1574
Permanent URL: https://doi.org/10.5167/uzh-42436

Download

Filetype: PDF
Size: 1MB

