Improving emotion annotation of music using citizen science (ICMPC/ESCOM 2021)
This work was carried out with Nicolás Gutiérrez-Páez, Lorenzo Porcaro, Aggelos Gkiokas, Perfecto Herrera, and Emilia Gómez, and was funded by the TROMPA EU project. It was presented at the 16th International Conference on Music Perception and Cognition (ICMPC/ESCOM).
Abstract
The annotation of emotions is a notably complex task for any type of data. Annotating emotions in music, in particular, involves three challenges: a) annotators are often unaware of the difference between perceived and induced emotions (perceived emotions are those the listener recognizes when interpreting musical properties, while induced emotions involve psycho-physiological responses to the music); b) the task is highly demanding for annotators, so gamification has been introduced to better engage and incentivise participants; and c) the inherent subjectivity of the task demands data reliability analysis: datasets are typically evaluated with internal consistency measures, while more recent content analysis methods have also started to be deployed.
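To give an idea of what such an internal consistency check looks like in practice, the sketch below computes Cronbach's alpha, one common internal consistency measure, over a toy ratings matrix with excerpts as rows and annotators as columns. This is only an illustration: the ratings are invented, and the specific measures and analysis used in the study are described in the paper itself.

```python
import numpy as np

def cronbach_alpha(ratings: np.ndarray) -> float:
    """Cronbach's alpha for an (excerpts x annotators) matrix of numeric ratings.

    Columns are treated as annotators; a higher alpha means the annotators'
    ratings are more internally consistent.
    """
    n_raters = ratings.shape[1]
    rater_variances = ratings.var(axis=0, ddof=1)      # variance of each annotator's ratings
    total_variance = ratings.sum(axis=1).var(ddof=1)   # variance of the per-excerpt rating sums
    return (n_raters / (n_raters - 1)) * (1 - rater_variances.sum() / total_variance)

# Invented example: 5 excerpts rated for valence on a 1-9 scale by 4 annotators.
ratings = np.array([
    [7, 6, 7, 8],
    [3, 2, 3, 3],
    [5, 5, 6, 5],
    [8, 7, 8, 9],
    [2, 3, 2, 2],
])
print(f"Cronbach's alpha: {cronbach_alpha(ratings):.2f}")
```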
You can see a presentation of our work in the following video: