Abstract
The discrimination of human gestures using wearable solutions is extremely important as a supporting technique for assisted living, healthcare of the elderly, and neurorehabilitation. This paper presents a mobile electromyography (EMG) analysis framework to serve as an auxiliary component in physiotherapy sessions or as feedback for neuroprosthesis calibration. We implemented a framework that integrates multiple sensors, EMG and visual information, to perform sensor fusion and improve the accuracy of hand gesture recognition tasks. In particular, we used an event-based camera adapted to run on the limited computational resources of mobile phones. We introduce a new publicly available dataset of sensor fusion for hand gesture recognition, recorded from 10 subjects, and use it to train the recognition models offline. We compare the online hand gesture recognition results of the fusion approach against the individual sensors, obtaining accuracy improvements of 13% and 11% over EMG and vision respectively, and reaching 85% accuracy.