
Quantitative Comparison of Human and Software Reliability in the Categorisation of Sit-to-stand Motion Pattern

Simone Battista
Member of the Collaboration Group
2021-01-01

Abstract

The Sit-to-Stand (STS) test is used in clinical practice as an indicator of decline in lower-limb function, especially in older adults. Because of its high variability, there is no standard approach for categorising the STS movement and recognising its motion pattern. This paper presents a comparative analysis between visual assessments and automated software for the categorisation of STS, based on recordings from a force plate. Five participants (30 ± 6 years) took part in two sessions of visual inspection of 200 STS movements performed under self-paced and controlled-speed conditions. Assessors were asked to identify three specific STS events from the Ground Reaction Force, in parallel with the software analysis: the start of the trunk movement (Initiation), the beginning of the stable upright stance (Standing) and the sitting movement (Sitting). The absolute agreement between the repeated raters' assessments, and between the raters' and the software's assessments in the first trial, were taken as indexes of human and software performance, respectively. No statistical differences between methods were found for the identification of the Initiation and Sitting events at self-paced speed, and for the Sitting event only at controlled speed. The estimated significant maximum discrepancies between visual and automated assessments were 0.200 [0.039; 0.361] s in unconstrained conditions and 0.340 [0.014; 0.666] s for standardised movements. The software assessments showed overall good agreement with the visual evaluations of the Ground Reaction Force while relying on objective measures.
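The abstract does not describe the software's detection algorithm. Purely as an illustration, the sketch below shows one plausible threshold-based way to flag the three events (Initiation, Standing, Sitting) from a vertical Ground Reaction Force trace recorded on a force plate. The function name `detect_sts_events`, the thresholds, the default sampling rate and the plateau heuristic are all assumptions for this sketch, not the method used in the paper.

```python
import numpy as np

def detect_sts_events(vgrf, fs=100.0, quiet_s=0.5, stable_s=0.5, k=4.0):
    """Hypothetical threshold-based detection of Initiation, Standing and
    Sitting from a vertical GRF trace (N). The trial is assumed to start in
    quiet sitting, reach a stable upright stance, and end back in sitting.
    Returns event times in seconds (None if an event is not found)."""
    vgrf = np.asarray(vgrf, dtype=float)
    nq = int(quiet_s * fs)
    base, noise = vgrf[:nq].mean(), vgrf[:nq].std() + 1e-9

    # Initiation: first departure from the quiet-sitting force level.
    moving = np.abs(vgrf - base) > k * noise
    initiation = int(np.argmax(moving))

    # Standing: first instant after Initiation where the force stays inside a
    # narrow band around the upright plateau for `stable_s` seconds.
    ns = int(stable_s * fs)
    plateau = np.median(vgrf[initiation:])   # rough estimate of upright body weight
    inside = np.abs(vgrf - plateau) < k * noise
    standing = None
    for i in range(initiation, len(vgrf) - ns):
        if inside[i:i + ns].all():
            standing = i
            break

    # Sitting: first departure from the upright plateau after Standing.
    sitting = None
    if standing is not None:
        after = np.abs(vgrf[standing + ns:] - plateau) > k * noise
        if after.any():
            sitting = standing + ns + int(np.argmax(after))

    to_s = lambda i: None if i is None else i / fs
    return {"initiation": to_s(initiation),
            "standing": to_s(standing),
            "sitting": to_s(sitting)}
```

For a trial sampled at 1 kHz, `detect_sts_events(force_trace, fs=1000)` would return the three event times in seconds; comparing such automated estimates with the raters' visual picks is the kind of agreement analysis the paper reports.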
Keywords: Force; Visualization; Standards; Software reliability; Protocols; Particle measurements; Hip; Sensors activity recognition; Kinematics; Sensor systems; Aged; Biomechanical Phenomena; Humans; Reproducibility of Results; Software; Movement; Torso
Files in this product:
File: 09405602 (1).pdf
Access: open access
Type: Publisher's version
License: Creative Commons
Size: 2.41 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11562/1049968
Citations
  • PMC: 0
  • Scopus: 2
  • Web of Science: 2