Information and neural dynamics in the perception of musical structure

Lead Research Organisation: Goldsmiths, University of London
Department Name: Computing Department

Abstract

Music is one of the things that makes us human. No known human society exists without music, and no other species seems to exhibit musical behaviour in the same sense as humans. It is an open question where music came from in evolutionary terms, but it is self-evident that it arises from the human brain: for there to be music, a brain must have been involved somewhere, even if only in listening. What is not evident at all is how brains (or the minds to which they give rise) make, or even perceive, music. This project aims to understand how human musical behaviour can be modelled using computers, by building programs that embody theories of how the musical mind works, then comparing their behaviour with that of humans engaged in musical activity, and comparing their predictions with those of an expert music analyst. The project will therefore contribute to several areas of study: computer music, statistical methods for cognitive modelling (and hence to cognitive linguistics, because the same kinds of models can be used there), musicology, and neuroscience (both through a better understanding of brain function and through new methods for neural signal analysis). Likely long-term outcomes are computer systems that support music education, that play music musically, and that interact with human musicians musically; understanding that helps musicians do what they do more effectively; and understanding that helps brain scientists and psychologists learn more about how the brain and the mind work. Above all, since musicality is so fundamental to humanity, the project aims to help understand some of what it means to be human.

Publications

 
Description We have demonstrated that a particular kind of computer modelling, using Markov chains and information theory, can robustly simulate various aspects of music perception. In particular, it is possible to predict automatically what listeners expect next during a melody, and how certain they are in that expectation, and to relate these quantities to aspects of musical structure and experience such as phrase segmentation and emotional response. We have supplied further evidence of the relationship between neural responses (as measured by EEG) and the information-theoretic measures used. Overall, we have made substantial progress in understanding the relationship between information in music, its processing in the brain, and the resulting effects on the mind. We have demonstrated this in the musical context by showing that the system is capable, to some degree, of simulating the work of music analysts. We have also shown that the same kind of model can segment linguistic elements correctly, lending weight to the hypothesis that music and language are related in terms of processing.
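To illustrate the kind of modelling described above, the sketch below (in Python, and not the IDyOM implementation itself) shows how a simple first-order Markov model over melodic pitches yields two information-theoretic quantities per note: information content (how unexpected the note is, in bits) and entropy (how uncertain the model is before the note arrives). The corpus and test melody are illustrative toy data only.

    # Minimal sketch, assuming a toy corpus of melodies given as MIDI pitch lists.
    from collections import defaultdict
    import math

    def train(corpus):
        """Count first-order pitch transitions over a corpus of melodies."""
        counts = defaultdict(lambda: defaultdict(int))
        for melody in corpus:
            for prev, nxt in zip(melody, melody[1:]):
                counts[prev][nxt] += 1
        return counts

    def predictive_distribution(counts, context, alphabet):
        """Laplace-smoothed distribution over the next pitch given the context."""
        total = sum(counts[context].values()) + len(alphabet)
        return {p: (counts[context][p] + 1) / total for p in alphabet}

    def analyse(counts, melody, alphabet):
        """Per-note information content (bits) and predictive entropy (bits)."""
        results = []
        for prev, nxt in zip(melody, melody[1:]):
            dist = predictive_distribution(counts, prev, alphabet)
            ic = -math.log2(dist[nxt])                         # unexpectedness of the note
            h = -sum(p * math.log2(p) for p in dist.values())  # uncertainty before hearing it
            results.append((nxt, ic, h))
        return results

    if __name__ == "__main__":
        corpus = [[60, 62, 64, 65, 64, 62, 60],   # toy melodies (MIDI pitches)
                  [60, 62, 64, 62, 60],
                  [64, 65, 67, 65, 64]]
        alphabet = sorted({p for m in corpus for p in m})
        counts = train(corpus)
        for pitch, ic, h in analyse(counts, [60, 62, 64, 67, 65], alphabet):
            print(f"pitch {pitch}: IC = {ic:.2f} bits, entropy = {h:.2f} bits")

Quantities of this kind, computed note by note, are the sort of signals that the project relates to phrase boundaries, emotional response, and EEG-measured neural activity, as described above.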
Exploitation Route The outcomes of the project may be used, in the medium term, in generative music systems and in music education tools. In the very long term, the models of general creativity arising from the project may give rise to more human-like, creative computers. The work carried out here is basic science, and is therefore most likely to be exploited, in the short term, in further research. In the longer term, however, we expect that developments of these systems will be used in automatic or semi-automatic music generators, in music education, and as experimental tools for composers. This will be explored in the EU FP7 Lrn2Cre8 project (see Outcomes). More speculatively, the work has led to a proposal for a cognitive architecture for creativity, possibly admitting human-like creativity in computers, and this will be examined further in the EU FP7 ConCreTe project (see Outcomes).
Sectors Creative Economy, Digital/Communication/Information Technologies (including Software)

URL http://www.idyom.org
 
Description ConCreTe
Amount £1,931,591 (GBP)
Funding ID ConCreTe 
Organisation European Commission 
Sector Public
Country European Union (EU)
Start 10/2013 
End 09/2016
 
Description Learning To Create
Amount £1,931,663 (GBP)
Funding ID Lrn2Cre8 
Organisation European Commission 
Sector Public
Country European Union (EU)
Start 10/2013 
End 09/2016
 
Title IDyOM software 
Description This is a suite of software, designed and built by Dr Marcus Pearce, that predicts human expectation during perception of musical melodies. 
Type Of Material Computer model/algorithm 
Year Produced 2013 
Provided To Others? Yes  
Impact Other researchers are using it for their work. 
URL https://code.soundsoftware.ac.uk/projects/idyom-project/files
 
Description IDyOM 
Organisation Goldsmiths, University of London
Department Department of Music
Country United Kingdom 
Sector Academic/University 
PI Contribution This was a collaborative research project funded by EPSRC. We supplied expertise in computer science.
Collaborator Contribution Partners supplied expertise in music and psychology, respectively.
Impact Multi-disciplinary: Computer Science, Psychology, Music
Start Year 2010
 
Description IDyOM 
Organisation Goldsmiths, University of London
Department Department of Psychology
Country United Kingdom 
Sector Academic/University 
PI Contribution This was a collaborative research project funded by EPSRC. We supplied expertise in computer science.
Collaborator Contribution Partners supplied expertise in music and psychology, respectively.
Impact Multi-disciplinary: Computer Science, Psychology, Music
Start Year 2010
 
Description IDyOM 
Organisation Queen Mary University of London
Department School of Electronic Engineering and Computer Science
Country United Kingdom 
Sector Academic/University 
PI Contribution This was a collaborative research project funded by EPSRC. We supplied expertise in computer science.
Collaborator Contribution Partners supplied expertise in music and psychology, respectively.
Impact Multi-disciplinary: Computer Science, Psychology, Music
Start Year 2010
 
Description BBC News interview 
Form Of Engagement Activity A press release, press conference or response to a media enquiry/interview
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Public/other audiences
Results and Impact I was interviewed, along with my doctoral student Tom Hedges, about the production of computer-created music. The item was broadcast on BBC News (the six o'clock and ten o'clock bulletins) and on BBC World (international). A longer version was broadcast on BBC Radio 4's Today programme.
Year(s) Of Engagement Activity 2015