Real-Time Multi-Person Identification and Tracking via HPE and IMU Data Fusion

Mirco De Marchi; Cristian Turetta; Graziano Pravadelli; Nicola Bombieri
2024-01-01

Abstract

In the context of smart environments, crafting remote monitoring systems that are efficient, cost-effective, user-friendly, and respectful of privacy is crucial for many scenarios. Recognizing and tracking individuals via markerless motion capture systems in multi-person settings poses challenges due to obstructions, varying light conditions, and intricate interactions among subjects. On the other hand, methods based on data gathered by Inertial Measurement Units (IMUs) embedded in wearables grapple with other issues, including the precision of the sensors and their optimal placement on the body. We therefore argue that more accurate results can be achieved by combining human pose estimation (HPE) techniques with information collected by wearables. Thus, this paper introduces a real-time platform to track and identify persons by fusing HPE and IMU data. It exploits a matching model that consists of two synergistic components: the first employs a geometric approach, correlating orientation, acceleration, and velocity readings from the input sources, while the second utilizes a Convolutional Neural Network (CNN) to yield a correlation coefficient for each HPE and IMU data pair. The proposed platform achieves promising results in tracking and identification, with an accuracy rate of 96.9%.
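The abstract's geometric matching component correlates motion signals derived from the camera-side skeletons with those reported by the wearables. The sketch below illustrates that general idea only, and is not the authors' implementation: it assumes Python with NumPy/SciPy, a common resampled rate, a single wrist keypoint as the IMU mounting point, and Pearson correlation as the matching score; all names (FS, hpe_accel_magnitude, match_imus_to_skeletons) are hypothetical.

```python
# Minimal sketch (assumption, not the paper's code) of geometric HPE-IMU matching:
# derive acceleration magnitude from each skeleton's wrist trajectory, correlate it
# with each IMU's acceleration magnitude stream, and assign IMUs to skeletons with
# the Hungarian algorithm.

import numpy as np
from scipy.optimize import linear_sum_assignment

FS = 30.0  # assumed common sampling rate (Hz) after resampling both streams


def hpe_accel_magnitude(keypoint_xy: np.ndarray) -> np.ndarray:
    """Acceleration magnitude from a (T, 2) wrist trajectory in metres."""
    vel = np.gradient(keypoint_xy, 1.0 / FS, axis=0)   # (T, 2) velocity
    acc = np.gradient(vel, 1.0 / FS, axis=0)           # (T, 2) acceleration
    return np.linalg.norm(acc, axis=1)                 # (T,)


def match_imus_to_skeletons(hpe_tracks: list, imu_accels: list) -> list:
    """Return (skeleton_index, imu_index) pairs maximising signal correlation."""
    cost = np.zeros((len(hpe_tracks), len(imu_accels)))
    for i, track in enumerate(hpe_tracks):
        cam_acc = hpe_accel_magnitude(track)
        for j, imu_acc in enumerate(imu_accels):
            n = min(len(cam_acc), len(imu_acc))
            corr = np.corrcoef(cam_acc[:n], imu_acc[:n])[0, 1]
            cost[i, j] = -corr                          # minimise negative correlation
    rows, cols = linear_sum_assignment(cost)
    return list(zip(rows.tolist(), cols.tolist()))
```

In the paper, the second component replaces such a hand-crafted score with a CNN that outputs a learned correlation coefficient for each HPE-IMU pair; the sketch above only mirrors the geometric part.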
2024
Human tracking
Data fusion
Human pose estimation
Wearables
IMU

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11562/1115846