
Sensor fusion using EMG and vision for hand gesture classification in mobile applications


Ceolini, Enea; Taverni, Gemma; Khacef, Lyes; Payvand, Melika; Donati, Elisa (2019). Sensor fusion using EMG and vision for hand gesture classification in mobile applications. In: 2019 IEEE Biomedical Circuits and Systems Conference (BioCAS), Nara, Japan, 17 October 2019 - 19 October 2019.

Abstract

The discrimination of human gestures using wearable solutions is extremely important as a supporting technique for assisted living, healthcare of the elderly and neurorehabilitation. This paper presents a mobile electromyography (EMG) analysis framework to be an auxiliary component in physiotherapy sessions or as a feedback for neuroprosthesis calibration. We implemented a framework that allows the integration of multiple sensors, EMG and visual information, to perform sensor fusion and to improve the accuracy of hand gesture recognition tasks. In particular, we used an event-based camera adapted to run on the limited computational resources of mobile phones. We introduced a new publicly available dataset of sensor fusion for hand gesture recognition recorded from 10 subjects and used it to train the recognition models offline. We compare the online results of the hand gesture recognition using the fusion approach with the individual sensors, with an improvement in accuracy of 13% and 11% for EMG and vision respectively, reaching 85%.
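The abstract does not specify how the EMG and vision classifier outputs are combined. A common approach for this kind of multi-sensor setup is late fusion, where each modality produces per-class probabilities that are then merged. The sketch below illustrates one such scheme (a weighted average of class probabilities); the function name, weighting, and the toy probability vectors are illustrative assumptions, not the method used in the paper.

```python
import numpy as np

def late_fusion(emg_probs, vision_probs, w_emg=0.5):
    """Fuse per-class probabilities from two classifiers by weighted average.

    emg_probs, vision_probs: probability vectors over the gesture classes.
    w_emg: weight given to the EMG modality (vision gets 1 - w_emg).
    Returns the predicted class index and the fused probability vector.
    """
    emg_probs = np.asarray(emg_probs, dtype=float)
    vision_probs = np.asarray(vision_probs, dtype=float)
    fused = w_emg * emg_probs + (1.0 - w_emg) * vision_probs
    return int(np.argmax(fused)), fused

# Toy example with 3 gesture classes: the two modalities disagree on the
# top class, and fusion resolves the disagreement.
pred, fused = late_fusion([0.2, 0.5, 0.3], [0.1, 0.4, 0.5], w_emg=0.5)
print(pred, fused)
```

In practice the modality weight could be tuned on held-out data, which is one way a fused system can outperform either sensor alone, consistent with the accuracy gains reported in the abstract.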


Downloads

103 downloads since deposited on 05 Dec 2019
93 downloads since 12 months

Additional indexing

Item Type: Conference or Workshop Item (Paper), not refereed, original work
Communities & Collections: 07 Faculty of Science > Institute of Neuroinformatics
Dewey Decimal Classification: 570 Life sciences; biology
Scopus Subject Areas: Physical Sciences > Artificial Intelligence
Physical Sciences > Biomedical Engineering
Physical Sciences > Electrical and Electronic Engineering
Physical Sciences > Instrumentation
Language: English
Event End Date: 19 October 2019
Deposited On: 05 Dec 2019 09:39
Last Modified: 01 Oct 2020 00:02
Publisher: IEEE
ISBN: 9781509006175
OA Status: Green
Publisher DOI: https://doi.org/10.1109/biocas.2019.8919210

Download

Green Open Access
Content: Accepted Version
Filetype: PDF
Size: 1MB