Evaluation of Human Action Quality with Linear Recurrent Units and Graph Attention Networks on Embedded Systems

Filippo Ziche; Nicola Bombieri
2024-01-01

Abstract

Recent evolutions of recurrent neural networks (RNNs), such as S4, S4D, and LRU, have shown remarkable potential for very long-range sequence modeling tasks in vision, language, and audio, capturing dependencies over tens of thousands of steps. Unlike transformers, which face significant memory consumption challenges with large context sizes, they are a promising alternative thanks to their ability to operate effectively on embedded systems. While they have been evaluated for classification and segmentation tasks, no work in the literature has applied them in the context of human pose estimation. In this work we propose an architecture that combines such state space models (SSMs) with graph attention networks (GATs) to enable their application to human action quality evaluation on embedded systems.
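As background on the sequence-modeling component named in the abstract, the sketch below illustrates the diagonal Linear Recurrent Unit (LRU) recurrence: a complex, element-wise state update x_t = lam * x_{t-1} + B u_t followed by a real-valued readout y_t = Re(C x_t) + D u_t, with eigenvalues inside the unit disk for stability. This is a minimal NumPy illustration under those standard definitions; all parameter names are illustrative and it is not the authors' implementation.

import numpy as np

def lru_scan(u, lam, B, C, D):
    """Run a diagonal LRU over a sequence (illustrative sketch).

    x_t = lam * x_{t-1} + B @ u_t     (complex, element-wise recurrence)
    y_t = Re(C @ x_t) + D * u_t       (real readout with skip connection)

    u:   (T, d_in) real input sequence
    lam: (d_state,) complex diagonal recurrence, |lam| < 1 for stability
    B:   (d_state, d_in) complex input projection
    C:   (d_in, d_state) complex output projection (d_out == d_in here)
    D:   (d_in,) real skip connection
    """
    x = np.zeros(lam.shape, dtype=np.complex128)
    ys = []
    for t in range(u.shape[0]):
        x = lam * x + B @ u[t]                 # diagonal state update
        ys.append((C @ x).real + D * u[t])     # real-valued output
    return np.stack(ys)

# Toy usage with stable eigenvalues sampled inside the unit disk.
rng = np.random.default_rng(0)
d_in, d_state, T = 4, 8, 16
r = rng.uniform(0.5, 0.99, d_state)            # radii < 1 => stable
theta = rng.uniform(0.0, 2.0 * np.pi, d_state)
lam = r * np.exp(1j * theta)
B = (rng.standard_normal((d_state, d_in))
     + 1j * rng.standard_normal((d_state, d_in))) / np.sqrt(2 * d_in)
C = (rng.standard_normal((d_in, d_state))
     + 1j * rng.standard_normal((d_in, d_state))) / np.sqrt(d_state)
D = rng.standard_normal(d_in)
y = lru_scan(rng.standard_normal((T, d_in)), lam, B, C, D)
print(y.shape)  # (16, 4)

Because the recurrence is linear and diagonal, it can also be unrolled as a parallel scan at training time, which is what makes such layers attractive for very long sequences on constrained hardware.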
2024
Graph Attention Networks
Embedded Systems
Linear Recurrent Units
Files in this product:
There are no files associated with this product.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11562/1125413