Measuring and Enhancing Expressive Musical Performance with Digital Instruments: Pilot Study and Research Workshop

Lead Research Organisation: Queen Mary University of London
Department Name: Sch of Electronic Eng & Computer Science

Abstract

This collaborative project between members of the Centre for Digital Music (C4DM) at Queen Mary, University of London and the multi-institution AHRC Research Centre for Musical Performance as Creative Practice (CMPCP) addresses the following questions:

* How can digital technology help explain what makes a performance expressive?
* How can an understanding of expressive performance guide the creation of new musical instruments and technologies?

Two emerging trends, if combined, hold the potential for transformative change in musical performance practice. First, the latest digital musical instruments capture the performer's actions with unprecedented detail, allowing continuous, precise control over every aspect of the resulting sound. Second, the study of musical performance as a creative act has taken a central role in musicology, with performers and scholars producing a vibrant interaction between theory and practice.

Music and technology have long been linked, but technology alone is not a driver of musical creativity. By extension, more dimensions of control do not make a digital instrument more expressive, and excessively complex interfaces can even become an impediment to expressive performance. The perspectives of performers and performance scholars are required to shape a new generation of digital instruments that are ideally suited to musicians' creative requirements.

On the other hand, digital technology is indispensable in the measurement and modelling of musical performance. Controlled quantitative studies of performers' actions complement qualitative techniques such as interviews and questionnaires to produce a detailed, balanced picture of performance practice. Digital musical instruments, including traditional acoustic instruments augmented with electronic sensors, provide a valuable data source concerning a performer's physical gestures.

This project promotes collaboration and knowledge exchange between performance scholars and digital music researchers through two main components:

First, a pilot study will be conducted using a sensor-enhanced acoustic piano. The study will focus on the link between expressive intent and physical gesture at the keyboard, and it will serve as a model for future extended cross-disciplinary collaborations. A refereed article will be published on the results, contributing to longstanding debates on the nature of physical keyboard technique (commonly known as "touch").

Second, a research networking event will take place as a special paper session of the Computer Music Modeling and Retrieval (CMMR) conference in June 2012. The event will draw academics, postdocs and students from musicological and technological disciplines with the goal of identifying areas of shared interest. A concert performance will be held afterward, with submissions invited from composers, performers and musicologists. Performers and scholars will be encouraged to attend both the paper session and the performance, promoting a wide range of perspectives at each event. Following the event, the project investigators will draft a document outlining potential areas of collaboration emerging from the session.

The proposed research will be directed by PI Andrew McPherson of C4DM with assistance from a postdoctoral researcher. Co-investigators Elaine Chew of C4DM and Daniel Leech-Wilkinson of King's College London/CMPCP will assist in the design of the pilot study and the organisation of the special session. All members of C4DM and CMPCP will be invited to contribute ideas on how digital technology can model and enhance expressive performance. Long-term impacts emerging from or influenced by this Research Development project include musical instruments that dynamically adapt to the abilities and tastes of the performer, interfaces for non-experts to express themselves musically, new pedagogical techniques and mathematical models of shape, phrasing and gesture in performance.

Planned Impact

The benefits of multidisciplinary research into expressive performance extend beyond academia. Beneficiaries in the commercial private sector include:

* Musical instrument companies. C4DM maintains relationships with manufacturers of both traditional instruments (e.g. Yamaha) and new interfaces (e.g. Focusrite/Novation). The proposed research could result in commercialisation of more expressive digital instruments.

* Musicians and sound artists, who benefit in two ways: (1) the deployment of new performance technologies in concert, and (2) the use of expressive performance models for reflection and self-analysis, potentially improving their skills.

* Computer hardware and software companies involved in creating human interfaces. Creating an intuitive interface is challenging, and understanding the subconscious processes involved in human-instrument musical interaction could inspire new approaches to interface design in other domains.

Beneficiaries in the public sector, third sector and the wider public:

* School music teachers, who can use digital instruments to improve and enhance both classroom curriculum and private lessons.

* Museums, through interactive exhibits exploring music performance (e.g. through live visualisation of sensor data).

* Amateur musicians, who benefit from (1) real-time feedback on their physical technique provided by sensor-equipped instruments, (2) a potentially faster learning curve provided by new instruments that are designed collaboratively with musicians and performance scholars, and (3) novel interfaces for musical expression which may emerge from collaborative study of expressive performance.

Cross-disciplinary activity of the type proposed in this project holds benefits for the UK at large. C4DM and CMPCP are internationally-leading centres in the fields of digital music and performance studies, respectively, and their activities draw students, academics and commercial partners from around the world. Joint efforts between the centres will open up exciting areas of study unique to the UK. What makes a performance expressive and how digital technology can influence it are challenging questions, and answering them will have lasting economic and cultural influence. By forging interdisciplinary collaborations, this proposal represents an initial step toward addressing this fascinating area of research.

Publications

 
Title CMMR concert 
Description Three-concert performance series as part of the Computer Music Modeling and Retrieval conference, 2012, which blended traditional acoustic instruments with digital technology. 
Type Of Art Performance (Music, Dance, Drama, etc) 
Year Produced 2012 
Impact Positive feedback from audiences and follow-up collaborations with participants, including a later project to provide a magnetic resonator piano kit (my digital instrument research) to a well-known UK composer/researcher to be used in a new piece. 
 
Description This project built connections among musicians, musicologists, computer scientists and engineers to study expressive musical performance. Instrument design was a particular focus of the project since it draws on each of these areas.

The project was organised around two activities: an interdisciplinary workshop on expressive performance held at the Computer Music Modeling and Retrieval conference, Queen Mary University of London, in June 2012; and a pilot study of keyboard technique using the TouchKeys multi-touch keyboard technology previously developed by the PI.

The workshop included 10 speakers from a range of disciplines, coming from across Europe and North America. Each presented a short paper followed by a brief group discussion. The workshop was complemented by a three-evening series of concerts featuring new instruments and new performance technologies. These were held at the historic Wilton's Music Hall in London.

The pilot study of keyboard technique yielded new insights into the physical motion of the pianist's fingers while playing the keyboard: though sound production on the piano itself is percussive in nature (hammers striking strings), the motions needed to play the piano are by nature continuous. Pianists nearly universally agree that the subtle details of physical motion, known as "touch", are crucial to achieving the right expressive results.

A major finding of the pilot study concerned how, on a newly augmented instrument, novel techniques can be integrated alongside familiar ones. Playing the piano is already a complex activity, and it is crucial that any new techniques, such as those made possible by the TouchKeys sensors, do not interfere with traditional playing. We developed new methods of adding vibrato and pitch bends to each note on the keyboard and verified their usability in a study of conservatory pianists.
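The general shape of such a vibrato/pitch-bend mapping can be sketched in outline. The following is an illustrative reconstruction rather than the project's actual code: the class name, dead-zone threshold and scaling factor are assumptions. Only the overall idea (anchoring the bend to the finger's touch position at note onset, so that conventional playing produces no bend) and the standard 14-bit MIDI pitch-bend encoding are taken as given.

```python
# Illustrative sketch, not the TouchKeys implementation. Maps vertical
# finger motion along a key, measured relative to the touch position at
# note onset, to standard MIDI pitch-bend messages.

PITCH_BEND_CENTRE = 8192       # midpoint of the 14-bit MIDI bend range
BEND_RANGE_SEMITONES = 2.0     # assumed bend range of the receiving synth
DEAD_ZONE = 0.02               # assumed noise threshold (key-length units)

class TouchBendMapper:
    """Per-note mapper: touch motion after onset -> pitch bend."""

    def __init__(self):
        self.onset_y = None    # touch position (0..1 along key) at note-on

    def note_on(self, touch_y):
        # Anchor the bend to wherever the finger lands, so striking the
        # key anywhere (as in traditional technique) starts at zero bend.
        self.onset_y = touch_y

    def touch_moved(self, touch_y):
        """Return a 3-byte MIDI pitch-bend message for the new position."""
        delta = touch_y - self.onset_y
        if abs(delta) < DEAD_ZONE:
            delta = 0.0        # ignore incidental micro-motion
        semitones = delta * 1.0   # illustrative: full key travel = 1 semitone
        bend = int(PITCH_BEND_CENTRE + semitones / BEND_RANGE_SEMITONES * 8191)
        bend = max(0, min(16383, bend))
        # MIDI pitch bend: status byte, then 7-bit LSB and MSB
        return (0xE0, bend & 0x7F, (bend >> 7) & 0x7F)
```

Anchoring to the onset position is what keeps a new technique compatible with existing technique: a note struck and held in the ordinary way never leaves the dead zone, so no bend message alters the sound.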
Exploitation Route The findings on expressive piano touch hold relevance to piano teachers and students as a way of better understanding the subtleties of motion needed to perform at an expert level. The sensor technology and analytical methods could potentially be deployed in the context of piano lessons, for example through visualisation of how the fingers move on the keys.

Insights into expressive piano touch are also highly relevant to instrument designers, for whom connecting to a performer's existing experience is important to achieving a positive reception. The more a new instrument can draw on existing technique, the faster the path to expertise on that instrument. Where the new instrument introduces completely new techniques, it is important that they fit within the constraints of traditional performance. Our research yielded insights in this area, and they could be put to use by instrument builders through analysis of our motion data (specifically in the case of the keyboard) or by following similar methodologies (applicable to many instruments).

Finally, the TouchKeys multi-touch keyboard, used for the studies, was itself significantly improved as a result of the project, and it has already had wide-ranging commercial and musical impact. Further details are found in the RCUK Narrative Impact section.
Sectors Creative Economy,Digital/Communication/Information Technologies (including Software),Culture, Heritage, Museums and Collections

 
Description The pilot study of keyboard technique in this project led to new mappings between motion and sound on the TouchKeys keyboard: these mappings let the performer add vibrato and pitch bends to each note in a way that did not interfere with traditional piano technique. This compatibility between existing experience and new techniques made it possible to develop the TouchKeys into a product suitable for a wider performer community.

Over the 2012-2013 period, an improved prototype of the TouchKeys hardware was developed incorporating feedback from user studies (for example, two-dimensional sensing on both black and white keys, and better ergonomics). In July 2013, I launched a Kickstarter crowd-funding campaign [1] to produce and distribute TouchKeys instruments to musicians. Both prebuilt instruments and self-install sensor kits were available. The campaign raised over £46,000, exceeding its goal of £30,000, and instruments were shipped to musicians in 20 countries. The campaign also generated a large amount of publicity, including 25 media articles, 7,000+ Facebook likes and 150,000+ YouTube plays. The publicity led to invited presentations at the IRCAM Forum (Paris, November 2013) and the Innovation in Music conference (York, December 2013).

Since the instruments shipped, musicians who bought the TouchKeys have begun uploading their own videos. Examples include using it to simulate the string bending found in many rock guitar solos, and splitting the keys into multiple segments by touch location in order to play microtonal Turkish maqam music. The software has been released open source under the GNU General Public License. In 2015, I launched a second production run of TouchKeys; in total, sales of kits and instruments have now reached £100k. Also in 2015, QMUL began the process of spinning out the project into an independent company, TouchKeys Instruments Ltd. In December 2016, the company launched a web shop for continuing public sale of the kits and instruments, and it remains in ongoing talks with major music manufacturers. [1] https://www.kickstarter.com/projects/instrumentslab/touchkeys-multi-touch-musical-keyboard
Sector Creative Economy,Digital/Communication/Information Technologies (including Software)
Impact Types Cultural,Economic

 
Description International Short Visits
Amount SFr. 3,510 (CHF)
Organisation Swiss National Science Foundation 
Sector Public
Country Switzerland
Start 06/2013 
End 07/2013
 
Description QTech Commercialisation Fund
Amount £30,500 (GBP)
Organisation Queen Mary University of London 
Department Queen Mary Innovation
Sector Private
Country United Kingdom
Start 07/2014 
End 01/2015
 
Description Queen Mary Innovation Fund
Amount £3,500 (GBP)
Organisation Queen Mary University of London 
Department Queen Mary Innovation
Sector Private
Country United Kingdom
Start 04/2013 
End 09/2013
 
Description Jennifer MacRitchie CSI/UWS 
Organisation Conservatory of Italian Switzerland
Country Switzerland 
Sector Academic/University 
PI Contribution Set up the capacitive touch sensing technology we developed on an acoustic piano in a music conservatory environment, in order to capture detailed information about the finger motion of expert and student pianists.
Collaborator Contribution Partner conservatory provided access to facilities (rehearsal spaces and piano) and time of piano teachers and students. Research partner also set up high-speed camera system for tracking hand and arm motion at the keyboard. Research collaborator has since moved to University of Western Sydney (2014) where data analysis continues.
Impact J. MacRitchie and A. McPherson. Integrating optical finger motion tracking with surface touch events. Frontiers in Psychology, 2015, 6:702. This project was also supported by a travel grant from the Swiss National Science Foundation.
Start Year 2013
 
Title CONTROL METHODS FOR MUSICAL PERFORMANCE 
Description A method for generating music is provided, the method comprising: receiving, on a capacitive touch-sensitive interface such as a keyboard, multi-finger gesture inputs having a first component and a second component, wherein the second component has a temporal evolution such as speed; determining the onset of an audio signal, such as a tone, based on the first component; analysing the temporal evolution of the second component to determine MIDI or Open Sound Control (OSC) instructions; and modifying the audio signal based on the instructions, in particular by decoupling the temporal relationship between specific gesture inputs (e.g. at key onset, during a note and upon key release), thus mapping gesture and motion inputs to obtain previously unachievable musical effects with music synthesizers.
IP Reference WO2015028793 
Protection Patent application published
Year Protection Granted 2015
Licensed No
Impact Commercial licensing discussions are actively underway at the time of writing, but have not formally concluded.
 
Title CONTROL METHODS FOR MUSICAL PERFORMANCE 
Description A method for generating music is provided, the method comprising: receiving, on a capacitive touch-sensitive interface such as a keyboard, multi-finger gesture inputs having a first component and a second component, wherein the second component has a temporal evolution such as speed; determining the onset of an audio signal, such as a tone, based on the first component; analysing the temporal evolution of the second component to determine MIDI or Open Sound Control (OSC) instructions; and modifying the audio signal based on the instructions, in particular by decoupling the temporal relationship between specific gesture inputs (e.g. at key onset, during a note and upon key release), thus mapping gesture and motion inputs to obtain previously unachievable musical effects with music synthesizers.
IP Reference US2016210950 
Protection Patent granted
Year Protection Granted 2016
Licensed Yes
Impact Commercialisation being pursued by spinout company TouchKeys Instruments Ltd.
 
Title TouchKeys software 
Description Open source control software for the TouchKeys multi-touch keyboard sensors, which use capacitive sensing to transform any keyboard into an expressive multi-touch surface. 
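As a generic illustration of what such a sensing layer computes (a sketch of the standard technique, not code from the TouchKeys repository): a continuous touch position along a key can be estimated as a weighted centroid of readings from a row of capacitive pads. The pad count and noise floor below are assumed values.

```python
# Generic capacitive-centroid sketch; pad count and threshold are
# assumptions, not taken from the actual TouchKeys firmware.

NUM_PADS = 8        # assumed number of sense pads along one key
NOISE_FLOOR = 10    # raw counts below this are treated as no touch

def touch_centroid(readings):
    """Return (position, size), with position normalised to 0..1 along
    the key, or None if no touch is detected."""
    # Subtract the noise floor so idle pads contribute nothing.
    levels = [max(0, r - NOISE_FLOOR) for r in readings]
    total = sum(levels)
    if total == 0:
        return None
    # Weighted centroid of pad indices, normalised to the key length.
    centre = sum(i * v for i, v in enumerate(levels)) / total
    return (centre / (NUM_PADS - 1), total)
```

The centroid gives sub-pad resolution from a small number of discrete sensors, which is what allows an ordinary key surface to behave as a continuous touch controller.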
Type Of Technology Software 
Year Produced 2013 
Open Source License? Yes  
Impact Successful Kickstarter crowd-funding campaign raised £46k to build and distribute instruments to musicians in 20 countries; a second production run commenced in 2015, bringing the total raised to around £100k. Musicians using the instrument have contributed their own videos, and development of the project is ongoing. Invitations to present TouchKeys at several international conferences and venues. 
URL https://code.soundsoftware.ac.uk/projects/touchkeys/
 
Company Name TouchKeys Instruments Ltd. 
Description TouchKeys Instruments Ltd. is a spin-out focused on commercialising the TouchKeys multi-touch keyboard research developed by Andrew McPherson in the Centre for Digital Music at Queen Mary University of London. TouchKeys transforms the piano-style keyboard into an expressive multi-touch control surface using capacitive sensing on the surface of every key. With TouchKeys the player can naturally add vibrato, pitch bends and other forms of continuous note shaping to each note. The company aims to license the TouchKeys IP to a larger keyboard manufacturer, while also selling self-install sensor kits directly to the public. 
Year Established 2015 
Impact The company was formed in mid-2015 and began commercial licensing discussions with an established European keyboard manufacturer in early 2016. These discussions are still ongoing. In the meantime, in December 2016 the company launched a new web site with an online shop supporting sales of kits and keyboards to the general public. In addition to kit sales, this activity has generated media publicity and led to several invited talks.
Website http://touchkeys.co.uk
 
Description Bela and TouchKeys at Superbooth 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Industry/Business
Results and Impact In April 2017, we presented two research spinout projects at the Superbooth trade show in Berlin. Superbooth is an annual 3-day event aimed at makers and users of modular synthesisers and related music technology, which draws thousands of attendees each year. We hosted a booth featuring two projects: Bela, an open-source embedded platform for audio and sensor processing, which launched on Kickstarter in 2016 and subsequently spun out into a company, Augmented Instruments Ltd; and TouchKeys, a sensor kit which installs onto a piano-style keyboard to add capacitive touch sensing of the location of the fingers. The event also included a workshop where 15 people received a hands-on introduction to using Bela. The response to the event was very positive, with significant media publicity and many conversations with other industry professionals and members of the public. We plan to return for Superbooth 2018.
Year(s) Of Engagement Activity 2017
URL http://2017.superbooth.com/en/
 
Description CMMR workshop 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Other academic audiences (collaborators, peers etc.)
Results and Impact Organised an interdisciplinary workshop on expressive performance at the Computer Music Modeling and Retrieval conference, 2012. Brought together participants from scientific, artistic and humanities backgrounds. Paper presentations sparked discussions and helped make connections between participants.

Led to further collaborations with individuals in the workshop and wider interest in my instrument research.
Year(s) Of Engagement Activity 2012
 
Description TouchKeys IRCAM 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? Yes
Geographic Reach International
Primary Audience Public/other audiences
Results and Impact Live demo of TouchKeys multi-touch keyboard generated significant audience interest and led to online media coverage.

Interest from the public in purchasing their own TouchKeys keyboards.
Year(s) Of Engagement Activity 2013