
Prediction of manipulation actions


Fermüller, Cornelia; Wang, Fang; Yang, Yezhou; Zampogiannis, Konstantinos; Zhang, Yi; Barranco, Francisco; Pfeiffer, Michael (2016). Prediction of manipulation actions. arXiv:1610.00759 [Computer Vision and Pattern Recognition], Institute of Neuroinformatics.

Abstract

Looking at a person's hands, one can often tell what the person is going to do next, how his or her hands are moving, and where they will be, because an actor's intentions shape his or her movement kinematics during action execution. Similarly, active systems with real-time constraints cannot simply rely on passive video-segment classification; they must continuously update their estimates and predict future actions. In this paper, we study the prediction of dexterous actions. We recorded subjects performing different manipulation actions on the same object, such as "squeezing", "flipping", "washing", "wiping", and "scratching" with a sponge. In psychophysical experiments, we evaluated human observers' skill in predicting actions from video sequences of different lengths, depicting the hand movement in the preparation and execution of actions before and after contact with the object. We then developed a recurrent neural network-based method for action prediction that takes as input patches around the hand. We also used the same formalism to predict the forces on the fingertips, using synchronized video and force data streams for training. Evaluations on two new datasets showed that our system closely matches human performance in the recognition task and demonstrated our algorithm's ability to predict what dexterous action is performed and how.
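The abstract describes a recurrent network that continuously updates its action estimate as each new frame (a patch around the hand) arrives, rather than classifying a completed video segment. The idea can be sketched as follows; this is a minimal illustration, not the authors' implementation, and the feature dimension, hidden size, class count, and random weights are all hypothetical stand-ins for trained parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: per-frame hand-patch feature vector, hidden state,
# and five action classes (e.g. squeeze/flip/wash/wipe/scratch).
N_FEAT, N_HID, N_CLASSES = 64, 32, 5

# Random weights stand in for parameters learned from training data.
W_xh = rng.normal(0, 0.1, (N_HID, N_FEAT))
W_hh = rng.normal(0, 0.1, (N_HID, N_HID))
W_hy = rng.normal(0, 0.1, (N_CLASSES, N_HID))

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def predict_per_frame(frames):
    """frames: (T, N_FEAT) array of features from patches around the hand.
    Returns a (T, N_CLASSES) array: an action-class distribution at every
    frame, so the prediction is available and refined at any time."""
    h = np.zeros(N_HID)
    out = []
    for x in frames:
        h = np.tanh(W_xh @ x + W_hh @ h)   # recurrent state update
        out.append(softmax(W_hy @ h))      # anytime class estimate
    return np.array(out)

video = rng.normal(size=(30, N_FEAT))      # 30 frames of dummy features
probs = predict_per_frame(video)
print(probs.shape)                          # (30, 5)
```

Emitting a distribution per time step is what allows the comparison with human observers shown partial sequences of different lengths: the network's estimate after t frames plays the role of a human judgment after seeing t frames.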



Additional indexing

Item Type: Working Paper
Communities & Collections: 07 Faculty of Science > Institute of Neuroinformatics
Dewey Decimal Classification: 570 Life sciences; biology
Language: English
Date: 2016
Deposited On: 26 Jan 2017 11:41
Last Modified: 02 Feb 2018 11:45
Series Name: arXiv: Computer Vision and Pattern Recognition
OA Status: Green
Free access at: Related URL. An embargo period may apply.
Related URLs: https://arxiv.org/abs/1610.00759 (Organisation)

Download

Download PDF: 'Prediction of manipulation actions'
Content: Published Version
Filetype: PDF
Size: 3MB