Incremental learning of dynamic hand gestures in human-robot interaction

Other authors

Universitat Politècnica de Catalunya. Departament d'Enginyeria de Sistemes, Automàtica i Informàtica Industrial

Garrell Zulueta, Anais

Publication date

2024-07-01

Abstract

Human-robot interaction (HRI) has progressed rapidly in recent years, driven by advances in artificial intelligence, robotics, and the cognitive sciences. Its goal is to enable more intuitive and natural communication between humans and robots. Dynamic hand gestures (DHG) are an emerging means of achieving this aim and are increasingly being incorporated into HRI systems. Unlike static hand gestures, which involve holding a fixed hand pose, DHG span a wide range of continuous movements and variations in hand shape, orientation, and trajectory that humans use to communicate. This research focuses on the ability of humans to teach robots previously unseen gestures, allowing robots to adapt accurately to novel expressions. The process illustrates the symbiotic relationship between humans and robots: new gestures are added to the robot's repertoire through learning algorithms and perceptual mechanisms. Models that employ Incremental Learning (IL) strategies continuously acquire new information without forgetting what they have already learned. To reduce the need for additional hardware and resources, this work investigates hand gesture recognition methods and proposes extracting three-dimensional hand keypoint positions. Through testing and evaluation, the study aims to improve the capabilities of current Dynamic Hand Gesture Recognition (DHGR) models in real-time scenarios, providing insights into improving the dynamics of human-robot interaction. The model is validated through an extensive set of simulations and real-world experiments.
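
For context, a minimal sketch of what three-dimensional hand keypoint extraction from a camera stream can look like. The abstract does not name the thesis's tooling, so the use of the MediaPipe Hands library and the parameter values below are illustrative assumptions, not the author's pipeline:

    # Minimal sketch of 3D hand keypoint extraction from a webcam stream.
    # Library choice (MediaPipe Hands) is an assumption; the thesis does
    # not specify its tooling.
    import cv2
    import mediapipe as mp

    mp_hands = mp.solutions.hands

    cap = cv2.VideoCapture(0)  # default camera
    with mp_hands.Hands(max_num_hands=1,
                        min_detection_confidence=0.5,
                        min_tracking_confidence=0.5) as hands:
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB input; OpenCV captures BGR.
            results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if results.multi_hand_landmarks:
                # 21 landmarks per detected hand; x and y are normalized
                # image coordinates, z is depth relative to the wrist.
                landmarks = results.multi_hand_landmarks[0].landmark
                keypoints = [(lm.x, lm.y, lm.z) for lm in landmarks]
                # A DHG recognizer would consume the sequence of these
                # 21x3 keypoint frames rather than raw pixels.
    cap.release()

Feeding a recognizer keypoint sequences instead of raw video is what avoids extra hardware (e.g., depth sensors or gloves), which matches the abstract's stated motivation.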

Document Type

Master thesis

Language

English

Publisher

Universitat Politècnica de Catalunya

Rights

Open Access