A Graph Autoencoder Approach for Gesture Classification with Gesture AMR
Conference proceeding

Huma Jamil, Ibrahim Khebour, Kenneth Lai, James Pustejovsky and Nikhil Krishnaswamy
Proceedings of the 16th International Conference on Computational Semantics, pp. 41-48
01/01/2025

Abstract

Subjects: Computer Science; Computer Science, Artificial Intelligence; Computer Science, Theory & Methods; Language & Linguistics; Linguistics; Science & Technology; Social Sciences; Technology
We present a novel graph autoencoder (GAE) architecture for classifying gestures using Gesture Abstract Meaning Representation (GAMR), a structured semantic annotation framework for gestures in collaborative tasks. We leverage the inherent graphical structure of GAMR by employing Graph Neural Networks (GNNs), specifically an Edge-aware Graph Attention Network (EdgeGAT), to learn embeddings of gesture semantic representations. Using the EGGNOG dataset, which captures diverse physical gesture forms expressing similar semantics, we evaluate our GAE on a multi-label classification task for gestural actions. Results indicate that our approach significantly outperforms naive baselines and is competitive with specialized Transformer-based models like AMRBART, despite using considerably fewer parameters and no pretraining. This work highlights the effectiveness of structured graphical representations in modeling multi-modal semantics, offering a scalable and efficient approach to gesture interpretation in situated human-agent collaborative scenarios.
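The abstract describes an Edge-aware Graph Attention Network (EdgeGAT) that learns node embeddings while conditioning attention on edge features, as needed for GAMR graphs whose edge labels carry semantic roles. The sketch below is an illustrative NumPy implementation of one such edge-aware attention pass under generic assumptions (random weights, leaky-ReLU attention as in standard GAT); it is not the authors' exact architecture, and all parameter names are hypothetical.

```python
import numpy as np

def edge_gat_layer(x, edge_index, edge_attr, w_node, w_edge, a):
    """One edge-aware graph attention pass (illustrative sketch).

    x:          [N, d_node]  node features
    edge_index: [2, E]       (source, target) index pairs
    edge_attr:  [E, d_edge]  edge features (e.g. semantic-role labels)
    w_node:     [d_node, d_out]  node projection
    w_edge:     [d_edge, d_out]  edge projection
    a:          [3 * d_out]  attention weight vector
    """
    src, dst = edge_index
    h = x @ w_node                       # projected node features  [N, d_out]
    e = edge_attr @ w_edge               # projected edge features  [E, d_out]
    # attention logits from (source, target, edge) triples
    z = np.concatenate([h[src], h[dst], e], axis=-1)      # [E, 3*d_out]
    raw = z @ a
    logits = np.maximum(raw, 0.2 * raw)                   # leaky ReLU
    # softmax over the incoming edges of each target node
    alpha = np.exp(logits - logits.max())
    denom = np.zeros(x.shape[0])
    np.add.at(denom, dst, alpha)
    alpha = alpha / np.maximum(denom[dst], 1e-9)
    # attention-weighted aggregation of node-plus-edge messages
    out = np.zeros((x.shape[0], h.shape[1]))
    np.add.at(out, dst, alpha[:, None] * (h[src] + e))
    return out

# Toy usage on a 5-node, 3-edge graph with random parameters
rng = np.random.default_rng(0)
x = rng.normal(size=(5, 4))
edge_index = np.array([[0, 1, 2], [1, 2, 3]])  # edges 0->1, 1->2, 2->3
edge_attr = rng.normal(size=(3, 2))
w_node = rng.normal(size=(4, 8))
w_edge = rng.normal(size=(2, 8))
a = rng.normal(size=24)
out = edge_gat_layer(x, edge_index, edge_attr, w_node, w_edge, a)
```

In a full autoencoder, layers like this would encode the GAMR graph into node embeddings, with a decoder reconstructing the graph and a classification head producing the multi-label gesture predictions.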
