Tagging online music contents for emotion. A systematic approach based on contemporary emotion research

Lead Research Organisation: Durham University
Department Name: Music

Abstract

Current approaches to the tagging of music in online databases rely predominantly on music genre and artist name, and music tags are often ambiguous and inexact. Yet possibly the most salient feature of musical experiences is emotion. The few attempts undertaken so far to tag music for mood or emotion lack a scientific foundation in emotion research. The current project proposes to incorporate recent research on music-evoked emotion into the growing number of online musical databases and catalogues, notably the Geneva Emotional Music Scale (GEMS) - a rating measure for describing the emotional effects of music recently developed by our group. Specifically, the aim is to develop the GEMS into an innovative conceptual and technical tool for tagging online musical content for emotion. To this end, three studies are proposed. In Study 1, we will examine whether the GEMS labels and their grouping hold up against a much wider range of musical genres than those originally used for its development. In Study 2, we will use advanced data reduction techniques to select the most recurrent and important labels for describing music-evoked emotion. In Study 3, we will examine the added benefit of the new GEMS compared to conventional approaches to the tagging of music.
The anticipated impact of the findings is threefold. First, the research described here will advance our understanding of the nature and structure of emotions evoked by music. Developing a valid model of music-evoked emotion is crucial for meaningful research in the social sciences and in the neurosciences. Second, music information organization and retrieval can benefit from a scientifically sound and parsimonious taxonomy for describing the emotional effects of music. Searches of online music databases need no longer be confined to genre or artist, but can also incorporate emotion as a key experiential dimension of music. Third, a valid tagging scheme for emotion can assist both researchers and professionals in choosing music to induce specific emotions. For example, psychologists, behavioural economists, and neuroscientists often need to induce emotion in their experiments to understand how behaviour or performance is modulated by emotion. Music is an obvious choice for emotion induction in controlled settings because it is a universal language that lends itself to comparisons across cultures and because it is ethically unproblematic.

Planned Impact

The current tagging initiative would result in the first musical database to have been tagged for emotion in a systematic and principled manner, inspired by contemporary research in psychology. By incorporating the emotion tags in similarity algorithms, music information retrieval and organization could be substantially enriched. This would benefit four types of users.

1. Researchers in the social sciences will gain a deeper understanding of the types of emotion induced by music and how frequently they occur. Moreover, they will be able to search for musical excerpts with a particular emotion profile with unprecedented accuracy and efficiency. This would benefit psychologists, neuroscientists, and theatre, film and TV experts, who could search the database for stimuli with specific emotion-inducing properties to be used either in experiments or in other research studies in which the effects of emotion play a central role.

2. Music educators are also potential beneficiaries of the results of this project. How can students acquire a broad knowledge of the vast repertoire of musical compositions? This is often achieved by moving through genre, composer, artist, or time period. Yet a potentially more effective, creative, and perhaps more engaging way to learn about musical styles, songs and compositions is to navigate the musical space with the compass of emotion, as illustrated in the following paragraph.

3. General public users could benefit from recommendations that are no longer based on surface characteristics such as artist or genre, but on deep emotive responses to music that are shared across the community of users. As mentioned in the proposal, this should lead to more intriguing recommendations than the prevalent ones based on artist or genre. Assume, for example, that Eva Cassidy's "Autumn Leaves" had been tagged most frequently with the GEMS label "melancholic". A person listening to this song would then find recommendations for several pieces also predominantly tagged with "melancholic", either within the person's preferred genres, or across artists, genres, instruments, and time periods.
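To make the mechanism concrete, here is a minimal Python sketch with made-up tracks and tag counts (none of this reflects actual project data): it recommends tracks whose most frequent mood tag matches that of the seed track.

from collections import Counter

# Hypothetical per-track tag counts; in a real system these would come from
# the community of listeners applying GEMS labels to each track.
tag_counts = {
    "Autumn Leaves (Eva Cassidy)": Counter(melancholic=42, tender=11),
    "Gymnopedie No. 1": Counter(melancholic=30, tender=25),
    "Hey Ya!": Counter(joyful=50, energetic=37),
    "Adagio for Strings": Counter(melancholic=60, tense=8),
}

def dominant_mood(track):
    # The mood label applied most often to this track.
    return tag_counts[track].most_common(1)[0][0]

def recommend(seed_track):
    # Suggest other tracks sharing the seed track's dominant mood,
    # regardless of artist, genre, instrument, or time period.
    seed_mood = dominant_mood(seed_track)
    return [t for t in tag_counts
            if t != seed_track and dominant_mood(t) == seed_mood]

print(recommend("Autumn Leaves (Eva Cassidy)"))
# -> ['Gymnopedie No. 1', 'Adagio for Strings']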

4. Once suitably tagged for emotion, online music databases could be used by online companies to infer users' emotional states. For example, if the search history of an individual listener shows that the listener chose musical items tagged for interest or wonder, this might predispose the person to seek out novel products, such as technologically advanced goods (mobile phones, tablets, cars). In contrast, if the prevalent state is one of sadness or longing, this might predispose the person to seek out comforting goods (food, restaurants, wellness).

Publications

Saari P (2016) Genre-Adaptive Semantic Computing and Audio-Based Modelling for Music Mood Annotation in IEEE Transactions on Affective Computing

 
Description We set out to improve the tagging of music based on emotions. Previous research has relied on either utilitarian, music-specific, or ad-hoc emotion categories for music, and has utilised narrow, small pools of participants drawn from Western higher education. Utilitarian emotions consist of categories such as happy, sad and angry, or dimensions such as valence and arousal, none of which does full justice to the myriad moods expressed by music or aesthetic objects in general. Music-specific emotion models such as the Geneva Emotional Music Scale (Zentner et al., 2008) have a robust foundation but refer to emotions aroused by music, so their relevance for tagging moods expressed by music is not known.

The digital age of music consumption, boosted by the popularisation of mobile devices that stream large music catalogues, is increasingly bringing music into everyday activities. Previous research has investigated emotions in music without taking into account the everyday contexts in which music is actually heard.

First, the project established a new structure of moods expressed by music based on an unprecedented number of mood terms (647 in all), participants (5322 across five continents) and music tracks (4780). In three iterative crowdsourcing experiments, the relevance of all mood terms was established, refined and validated based on co-occurrence analyses of the terms across the tracks. The first experiment yielded 88 mood terms, which the second experiment collapsed hierarchically into 14 factors and, at a higher level, 3 factors. These factors were in turn validated in the third experiment.
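As a toy illustration of this kind of reduction, the following Python sketch builds a term-by-term co-occurrence matrix from a tiny, made-up track-by-term table and cuts a hierarchical clustering of the terms at two levels. The terms, data and cluster counts are illustrative assumptions only, not the project's actual pipeline.

import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

# Binary track-by-term matrix: rows = tracks, columns = mood terms
# (1 = the term was applied to the track).
terms = ["melancholic", "tender", "joyful", "energetic", "tense"]
X = np.array([
    [1, 1, 0, 0, 0],
    [1, 0, 0, 0, 1],
    [0, 0, 1, 1, 0],
    [0, 1, 1, 0, 0],
    [0, 0, 1, 1, 1],
])

# Term-by-term co-occurrence, turned into a normalised similarity matrix.
co = X.T @ X
norm = np.sqrt(np.outer(np.diag(co), np.diag(co)))
sim = co / np.maximum(norm, 1e-9)

# Convert similarity to a distance matrix and cluster hierarchically.
dist = 1.0 - sim
np.fill_diagonal(dist, 0.0)
Z = linkage(squareform(dist, checks=False), method="average")

# Cut the tree at two levels, loosely analogous to the finer and coarser
# factor solutions reported above (here 3 and 2 clusters for the toy data).
fine = fcluster(Z, t=3, criterion="maxclust")
coarse = fcluster(Z, t=2, criterion="maxclust")
for term, f, c in zip(terms, fine, coarse):
    print(f"{term}: fine cluster {f}, coarse cluster {c}")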

Second, the project delineated the links between the moods expressed by music and the typical activities associated with music. To this end, mood terms and music tracks were connected with nine core activities (daily routines, intellectual, entertainment, physical, and so on) synthesised from the literature. Novel interactions between moods and activities were uncovered, and these links could be represented by coherent and functional factors that allow flexible use of tags in music information retrieval tasks and services.

Third, the project assessed how well self-selected music tracks and mood tags are conveyed to other listeners. This made it possible to investigate the communication accuracy of mood tags, and the robustness of mood structures as representations that increase the reliability of this communication. To this end, the above-mentioned large-scale track data set was built so that participants both submitted music tracks, each associated with a specific mood term, and tagged tracks submitted by others. In addition to novel findings on the communication of moods, this yielded a novel crowdsourcing paradigm for evaluating the communication accuracy of moods in music using ecologically valid music examples.
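One simple way to operationalise this communication accuracy is sketched below with made-up data: the proportion of other listeners whose tag for a track matches the mood term chosen by its submitter. The track identifiers, mood labels and agreement measure are assumptions for illustration, not the project's exact analysis.

from collections import defaultdict

# Mood term chosen by the person who submitted each track.
submissions = {"t1": "melancholic", "t2": "joyful", "t3": "tense"}

# (track_id, mood) pairs from other listeners who tagged those tracks.
tags = [
    ("t1", "melancholic"), ("t1", "tender"), ("t1", "melancholic"),
    ("t2", "joyful"), ("t2", "joyful"),
    ("t3", "energetic"), ("t3", "tense"),
]

by_track = defaultdict(list)
for track_id, mood in tags:
    by_track[track_id].append(mood)

# Agreement = share of received tags that match the intended mood term.
for track_id, intended in submissions.items():
    received = by_track[track_id]
    accuracy = sum(m == intended for m in received) / len(received)
    print(f"{track_id}: intended '{intended}', agreement {accuracy:.0%}")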

Fourth, the project gained novel insights into the relationship between the perception of moods expressed by music, fitting activities, and contextual attributes of the music listener. This was achieved by probing the participants' age, region, gender, music preferences and preferred everyday activities.
Exploitation Route For a non-academic route to impact, the results can benefit the music business in two ways, particularly companies specialising in music recommendation, music app design, and streaming services (Spotify, Apple Music, Last.fm, Deezer, etc.): (1) enhancing music mood retrieval by utilising the multilevel, hierarchical structure of moods in music, which is both comprehensive in terms of vocabulary and consistent with online streaming services, and (2) increasing the relevance of music mood search by combining activities and personalised information (e.g., age, music preferences, and gender) in the search.

For academic pathways to impact, the data released in the project have the potential to stimulate and push forward academic research in music information retrieval and music psychology. The dataset is larger and richer than any previous dataset in the field, allowing researchers to test and evaluate mood and music models, to examine the acoustic correlates of the mood structures with a large collection of audio tracks, and to select appropriate, ecologically valid stimuli for further studies involving music and emotions.
Sectors Creative Economy, Digital/Communication/Information Technologies (including Software), Leisure Activities (including Sports, Recreation and Tourism)

 
Description There is no direct evidence that the findings of the project have been used by the creative economy or digital technologies. We carried out several rounds of negotiations with a non-British (EU) startup company specialising in services involving large-scale music data. They understood the merits of our findings concerning the links between activities, mood and music, and how these can be harnessed to deliver better recommendation and music retrieval systems. However, we could not agree on the terms of a commercial partnership with the company and were thus unable to deliver the full economic or digital impact via that route. At a more general level, the findings have put a spotlight on the topic in the media through broader interest in music recommendation and retrieval systems, but our main purpose was to carry out rigorous research on the topic, not to deliver societal impact as such.
 
Description Faculty Research Grant
Amount £750 (GBP)
Organisation Durham University 
Sector Academic/University
Country United Kingdom
Start 06/2015 
End 07/2015
 
Title Moods and activities in music (UK ReShare dataset) 
Description This dataset contains tagged data concerning moods and activities in music, as well as the taggers' music preferences and generic demographic information, and brief music metadata. All data relate to an ESRC-funded research project called 'Tagging music for emotions' (ES/K00753X/1). The data were obtained through a series of three tasks (1. mood and activity tagging, 2. track search and tagging, 3. track tagging for moods and activities) designed to map various moods and activities related to music. These tasks were run in Crowdflower (a crowdsourcing service) using workers of a specified quality level. 2508 workers contributed to the dataset, completing varying numbers of rounds (each comprising all tasks and a consistency check). No personal information was recorded in the initial data collection, and the demographics reveal gender, age, country and a raw index of musical preferences, language skill, and musical expertise. The mood terms (88), activities (9) and genres (113) are the results of pilot experiments, and the data also include a validation stage. A full description of the tasks will be available in the scientific reports, which are currently in preparation.
Type Of Material Database/Collection of data 
Year Produced 2015 
Provided To Others? Yes  
Impact None yet (1/2/2016) 
URL http://reshare.ukdataservice.ac.uk/852024/
 
Description Music and emotion research between Universities of Durham and Innsbruck 
Organisation University of Innsbruck
Department Department of Psychology
Country Austria 
Sector Academic/University 
PI Contribution Initiating a joint programme to study activities and situations involved in music and emotions.
Collaborator Contribution Knowledge of the pertinent emotion models, sophisticated modelling options, psychological experiments
Impact Joint publications under preparation
Start Year 2014
 
Title Crowdsourcing emotional expression in music 
Description A novel paradigm for obtaining crowdsourced annotations of the emotional expression of music, with simultaneous mapping of activities, tags and musical examples, using only music submitted by users rather than researchers. It was implemented as a dedicated SQL database linked to 7digital music examples, and the crowdsourcing aspect of the project was handled by connecting this database to Crowdflower. The system ran internal validity checks at regular intervals and used a combination of tasks to keep the crowdsourced workers motivated. A hypothetical sketch of the kind of schema such a database might use appears after this entry.
Type Of Technology Webtool/Application 
Year Produced 2015 
Impact It allowed the simultaneous collection of answers to multiple questions, which made it possible to estimate not only the emotional expression of music using a large array of mood terms, but also the communication accuracy of this mapping, since one of the tasks asked participants to assess other people's example tracks in terms of moods and activities.
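Purely as an illustration of the kind of relational schema such a tool might rest on, here is a minimal, hypothetical sketch using Python's built-in sqlite3 module; the table and column names are assumptions and do not describe the project's actual database.

import sqlite3

# In-memory database standing in for the dedicated SQL store described above.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE workers (
    worker_id INTEGER PRIMARY KEY,
    age       INTEGER,
    gender    TEXT,
    country   TEXT
);
CREATE TABLE tracks (
    track_id     INTEGER PRIMARY KEY,
    external_id  TEXT,               -- e.g. an identifier for a preview clip
    title        TEXT,
    artist       TEXT,
    submitted_by INTEGER REFERENCES workers(worker_id)
);
CREATE TABLE annotations (
    annotation_id INTEGER PRIMARY KEY,
    track_id      INTEGER REFERENCES tracks(track_id),
    worker_id     INTEGER REFERENCES workers(worker_id),
    mood_term     TEXT,              -- one of the mood terms
    activity      TEXT,              -- one of the core activities
    is_validation INTEGER DEFAULT 0  -- flags rows used for consistency checks
);
""")
conn.commit()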