Universitat Ramon Llull. IQS
2025-01
This study explores children’s emotions through a novel approach combining Generative Artificial Intelligence (GenAI) and Facial Muscle Activation (FMA). It examines GenAI’s effectiveness in creating facial images that convey genuine emotional expressions in children, alongside FMA’s analysis of the muscular activation underlying these expressions. The aim is to determine whether AI can realistically generate and recognize emotions comparable to human experience. The study involves generating a database of 280 images (40 per emotion) of children expressing various emotions. For real children’s faces from public databases (DEFSS and NIMH-CHEFS), five emotions were considered: happiness, anger, fear, sadness, and neutral. For AI-generated images, seven emotions were analyzed: the previous five plus surprise and disgust. A feature vector is extracted from each image, encoding the lengths between reference points on the face that contract or expand depending on the expressed emotion. This vector is then input into an artificial neural network for emotion recognition and classification, achieving accuracies of up to 99% in certain cases. This approach opens new avenues for training and validating AI algorithms, enabling models to be trained with artificial and real-world data interchangeably. Integrating both datasets during the training and validation phases enhances model performance and adaptability.
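The feature-extraction step described in the abstract (distances between facial reference points that contract or expand with muscle activation) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the landmark names, coordinates, and point pairs below are hypothetical assumptions, chosen only to show how such a distance-based feature vector would be built before being fed to a neural-network classifier.

```python
import math

# Hypothetical 2D facial landmarks in normalized image coordinates.
# The paper's actual reference points are not specified here; these
# names and positions are illustrative assumptions.
landmarks = {
    "left_brow_inner": (0.40, 0.35),
    "right_brow_inner": (0.60, 0.35),
    "left_eye_top": (0.38, 0.42),
    "left_eye_bottom": (0.38, 0.46),
    "mouth_left": (0.42, 0.72),
    "mouth_right": (0.58, 0.72),
    "mouth_top": (0.50, 0.69),
    "mouth_bottom": (0.50, 0.76),
}

# Point pairs whose separation changes with muscle activation,
# e.g. mouth width grows with a smile, brow gap shrinks with a frown.
PAIRS = [
    ("left_brow_inner", "right_brow_inner"),  # brow furrowing
    ("left_eye_top", "left_eye_bottom"),      # eye opening
    ("mouth_left", "mouth_right"),            # mouth width
    ("mouth_top", "mouth_bottom"),            # mouth opening
]

def feature_vector(points):
    """Return the Euclidean distance of each reference pair as a feature vector."""
    vec = []
    for a, b in PAIRS:
        (xa, ya), (xb, yb) = points[a], points[b]
        vec.append(math.hypot(xa - xb, ya - yb))
    return vec

fv = feature_vector(landmarks)
print(fv)  # one length per pair; this vector is the classifier input
```

In the study, a vector of this kind (one entry per reference-point pair) is what the artificial neural network consumes for emotion classification; the same extraction applies identically to real and AI-generated faces, which is what lets the two datasets be used interchangeably.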
Article
Published version
English
Generative artificial intelligence; Facial emotion recognition; Facial muscle activation; Artificial neural networks; Artificial intelligence; Facial expression; Children's emotions
p.18
MDPI
Big Data and Cognitive Computing 2025, 9(1), 15