Markerless motion tracking to quantify behavioral changes during robot-assisted gait training: A validation study


van Dellen, Florian; Hesse, Nikolas; Labruyère, Rob (2023). Markerless motion tracking to quantify behavioral changes during robot-assisted gait training: A validation study. Frontiers in Robotics and AI, 10:1155542.

Abstract

Introduction: Measuring kinematic behavior during robot-assisted gait therapy requires either the laborious setup of a marker-based motion capture system or reliance on the internal sensors of devices, which may not cover all relevant degrees of freedom. This presents a major barrier to the adoption of kinematic measurements in the normal clinical schedule. However, to advance the field of robot-assisted therapy, many insights could be gained from evaluating patient behavior during regular therapies.
Methods: For this reason, we recently developed and validated a method for extracting kinematics from recordings of a low-cost RGB-D sensor, which relies on a virtual 3D body model to estimate the patient's body shape and pose in each frame. The present study aimed to evaluate the robustness of the method to the presence of a lower limb exoskeleton. Ten healthy children without gait impairment walked on a treadmill with and without wearing the exoskeleton to evaluate the estimated body shape, and eight custom stickers were placed on the body to evaluate the accuracy of the estimated poses.
Results & Conclusion: We found that the estimated shape is generally robust to wearing the exoskeleton, and systematic pose tracking errors were around 5 mm. The method can therefore be a valuable measurement tool for clinical evaluation, e.g., to measure compensatory movements of the trunk.
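The abstract reports systematic pose tracking errors of around 5 mm between the model-based estimates and the sticker-based reference. As an illustration only (not the authors' implementation), the sketch below shows one common way such a systematic (mean signed) error and its frame-to-frame variability could be computed from paired 3D marker trajectories; the array names, shapes, and the synthetic data are assumptions.

```python
import numpy as np

# Illustrative sketch, not the paper's code. Assumes two arrays of shape
# (n_frames, n_markers, 3), in millimetres, holding the model-estimated and
# the reference marker positions expressed in the same coordinate frame.

def systematic_error(estimated: np.ndarray, reference: np.ndarray) -> np.ndarray:
    """Mean signed error per marker and axis, averaged over frames (mm)."""
    diff = estimated - reference       # per-frame signed deviations
    return diff.mean(axis=0)           # shape: (n_markers, 3)

def random_error(estimated: np.ndarray, reference: np.ndarray) -> np.ndarray:
    """Frame-to-frame variability (standard deviation) per marker and axis (mm)."""
    diff = estimated - reference
    return diff.std(axis=0)

if __name__ == "__main__":
    # Hypothetical data: 8 markers tracked over 500 frames with a ~5 mm offset.
    rng = np.random.default_rng(0)
    ref = rng.normal(size=(500, 8, 3)) * 100.0
    est = ref + 5.0 + rng.normal(scale=2.0, size=ref.shape)
    print("systematic error (mm):\n", systematic_error(est, ref).round(1))
    print("random error (mm):\n", random_error(est, ref).round(1))
```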


Statistics

Citations

2 citations in Web of Science®
3 citations in Scopus®

Downloads

2 downloads since deposited on 19 Jan 2024
2 downloads in the past 12 months

Additional indexing

Item Type: Journal Article, refereed, original work
Communities & Collections: 04 Faculty of Medicine > University Children's Hospital Zurich > Medical Clinic
Dewey Decimal Classification: 610 Medicine & health
Scopus Subject Areas: Physical Sciences > Computer Science Applications
Physical Sciences > Artificial Intelligence
Language: English
Date: 6 March 2023
Deposited On: 19 Jan 2024 12:01
Last Modified: 30 Apr 2024 01:47
Publisher: Frontiers Research Foundation
ISSN: 2296-9144
OA Status: Gold
Free access at: Publisher DOI. An embargo period may apply.
Publisher DOI: https://doi.org/10.3389/frobt.2023.1155542
PubMed ID: 36950282
  • Content: Published Version
  • Language: English
  • Licence: Creative Commons: Attribution 4.0 International (CC BY 4.0)