On the pose estimation software for measuring movement features in the finger-to-nose test

Enrico Martini; Nicola Valè; Michele Boldo; Anna Righetti; Nicola Smania; Nicola Bombieri
2022-01-01

Abstract

Assessing upper limb (UL) movements post-stroke is crucial to monitor and understand sensorimotor recovery. Recently, several research works have focused on the relationship between reach-to-target kinematics and clinical outcomes. Since the assessment of sensorimotor impairments is conventionally based on clinical scales and observation, and hence likely to be subjective, one challenge is to quantify such kinematics through automated platforms like inertial measurement units, optical, or electromagnetic motion capture systems. Even more challenging is to quantify UL kinematics through non-invasive systems, to avoid any influence or bias in the measurements. In this context, tools based on video cameras and deep learning software have been shown to achieve high levels of accuracy in estimating the human pose. Nevertheless, an analysis of their accuracy in measuring kinematic features for the Finger-to-Nose Test (FNT) is missing. We first present an extended quantitative evaluation of such inference software (i.e., OpenPose) for measuring a clinically meaningful set of UL movement features. Then, we propose an algorithm and the corresponding software implementation that automates the segmentation of the FNT movements. This allows us to automatically extrapolate the whole set of measures from the videos with no manual intervention. We measured the software accuracy by using an infrared motion capture system on a total of 26 healthy and 26 stroke subjects.
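The abstract does not detail the segmentation algorithm itself, but the general idea of segmenting FNT repetitions from pose-estimation output can be illustrated with a minimal sketch: given per-frame nose and fingertip keypoints (e.g., as produced by OpenPose), compute the fingertip-to-nose distance over time and mark nose touches as local minima of that signal. The function name, smoothing window, and threshold below are hypothetical illustration choices, not the authors' method.

```python
import numpy as np

def segment_fnt(nose_xy, finger_xy, smooth_win=5, prominence_frac=0.3):
    """Segment finger-to-nose repetitions from per-frame keypoints.

    nose_xy, finger_xy: (T, 2) arrays of pixel coordinates per video frame.
    Returns the frame indices of nose touches (local distance minima).
    """
    # Per-frame Euclidean distance between fingertip and nose.
    d = np.linalg.norm(finger_xy - nose_xy, axis=1)
    # Moving-average smoothing to suppress keypoint jitter.
    k = np.ones(smooth_win) / smooth_win
    ds = np.convolve(d, k, mode="same")
    # A "touch" is a local minimum well below the signal's range.
    thr = ds.min() + prominence_frac * (ds.max() - ds.min())
    return [
        t for t in range(1, len(ds) - 1)
        if ds[t] <= ds[t - 1] and ds[t] < ds[t + 1] and ds[t] < thr
    ]
```

From the touch indices, per-repetition features such as movement time (frames between consecutive touches divided by the frame rate) can then be derived automatically, without manual video annotation.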
2022
9781665481496
Finger-to-nose test
Upper limb movements
Human motion estimation
Files in this record:
There are no files associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11562/1070648
Citations
  • PubMed Central: not available
  • Scopus: 2
  • Web of Science: 1