Decoding Emotions through Music: A Physiological Analysis of Emotion Recognition
Paglialonga A;
2023
Abstract
In this study, we explored the field of emotion recognition, which has been gaining increasing interest both for the study of emotional control disorders and for human-computer interaction. The majority of emotion recognition studies in the literature use visual or audio-visual stimuli, so the audio-only component is often overlooked. Here, we analyzed data from an experiment on emotion recognition induced by music. By processing electrocardiogram, respiration, and skin conductance signals, we identified features that were highly relevant for separating four emotions (i.e., joy, anger, sadness, pleasure). We then built a machine learning model to classify the four emotions using only the three most relevant features, selected with a recent feature selection method based on graphical visualization. The results showed an average accuracy of 85% in classifying the four emotions, and more than 90% accuracy in classifying arousal and valence. This study highlights the potential of using audio-only stimuli for emotion recognition and the effectiveness of music as a tool for studying emotional control.
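To make the classification step concrete, the sketch below shows how a four-emotion classifier could be trained on three scalar physiological features per trial. The feature names, the synthetic data, the support vector machine, and the cross-validation setup are illustrative assumptions for exposition only; they are not the paper's actual feature-selection method, model, or data.

```python
# Minimal sketch (assumptions, not the paper's pipeline): classify four
# emotions from three physiological features, one each from ECG,
# respiration, and skin conductance.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
EMOTIONS = ["joy", "anger", "sadness", "pleasure"]

# Synthetic stand-in data: 40 trials per emotion, 3 hypothetical features
# (mean heart rate, respiration rate, skin-conductance level).
X = np.vstack([
    rng.normal(loc=[70 + 5 * i, 14 + i, 2 + 0.5 * i], scale=1.0, size=(40, 3))
    for i in range(len(EMOTIONS))
])
y = np.repeat(np.arange(len(EMOTIONS)), 40)

# Standardize the features, fit an RBF-kernel SVM, and report 5-fold accuracy.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, cv=5)
print(f"Mean cross-validated accuracy: {scores.mean():.2f}")
```

The same structure applies to the binary arousal and valence tasks reported in the abstract: relabel the four emotions along each dimension and retrain the classifier on the same three features.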