A Study of Multimodal Guitar Augmentation for Gestural Expressivity

Lead Research Organisation: Queen Mary University of London
Department Name: Sch of Electronic Eng & Computer Science


EPSRC : Adan Benito : EP/S022694/1

The invention of the electric guitar at the beginning of the 20th century sparked a musical and cultural revolution that changed not only how music was made and listened to, but also how new instruments were designed. However, new trends in popular music, together with advances in electronic and digital instruments, may be causing what many media outlets have called the demise of the guitar as a driver for cultural change.

Although the electric guitar has long been valued as a vehicle for musical expression, many of the efforts to bring the instrument up to date with the era of digital music fail to capture that richness. Past attempts at creating guitar-based interfaces, using either sensors or analysis of the sound produced by the instrument, have never been widely adopted, given the complexity of analysing musical gestures and translating their expressive meaning.

With this research we propose to validate an augmented instrument prototype: a guitar fitted with sensors and a hexaphonic pickup to detect different kinds of string bending, a technique commonly used by guitarists to raise the pitch of a string and add articulation to their playing. This gesture is popularly said to lend singing and talking qualities to guitar playing.

We will extract information from both the audio signals produced by the instrument and sensors that detect string movement, to obtain enhanced information on how guitarists use this resource. We will then analyse how performers interact with the instrument during performance, to assess the importance of this technique and to generate data that can later be used to better understand the characteristics of string bending as an expressive tool.


Description During the course of this research project, we have created a prototype toolkit for analysing specific idiomatic gestures of guitar playing and manipulating the way performers interact with the instrument via these resources.

As a result of developing the software and technical toolkit that accompanies the instrument, we encountered some of the shortcomings of heuristic analysis of musical gestures that are common to many previous approaches to augmented digital musical instrument (DMI) design. One of these limitations is that determining the boundaries of musical gestures in a real-time scenario is not trivial, and any error in gesture prediction disrupts the player's expectations and may render the augmentation unusable in a performative context.

Although the same bend-sensing technology present on the prototype was successfully used in a study on the offline analysis of gestures for identifying different kinds of articulation [1], correlating these with pitch contours and onsets extracted from a recorded signal, predicting gestures in real time in order to control a sound processing engine poses specific challenges. Constraints on the processing capabilities of embedded hardware and the tight requirements of real-time interaction demand short-term analysis techniques that often fail to provide information about the actual state of the gesture being analysed and may yield ambiguous predictions. Moreover, since string displacement can result from different simultaneous interactions (e.g. plucking or bending the strings), simple one-to-one mappings from conditioned sensor signals to musical parameters (as commonly found in many DMIs) are insufficient to differentiate between gestures.
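As an illustration of why a single conditioned sensor signal is ambiguous, the following sketch combines an audio onset flag with the displacement trajectory to separate a pluck from a bend. This is a simplified illustration, not the project's toolkit: the function name and thresholds are invented, and the actual analysis runs on embedded hardware.

```python
# Hypothetical sketch: disambiguating a pluck from a bend by combining
# an audio onset flag with the bend-sensor displacement trajectory.
# Function name and threshold values are illustrative only.

def classify_gesture(displacement, onset_detected,
                     bend_threshold=0.1, spike_threshold=0.5):
    """Classify one short analysis frame of string-displacement samples.

    displacement:   list of normalised sensor readings for the frame
    onset_detected: True if the audio analysis reported a transient
    """
    peak = max(abs(s) for s in displacement)
    # A transient together with a large displacement excursion
    # suggests the string was plucked:
    if onset_detected and peak > spike_threshold:
        return "pluck"
    # Sustained displacement without a new onset suggests a bend
    # in progress:
    if not onset_detected and peak > bend_threshold:
        return "bend"
    return "idle"
```

Note that neither channel alone suffices: the displacement peak by itself cannot tell a pluck from a bend, and the onset flag by itself cannot tell a pluck from fret-hand noise.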

Within this project, we developed a multimodal approach that combines efficient pitch detection and transient processing algorithms with information from string displacement to isolate specific gestures in real time, allowing mappings to be crafted for particular articulations. However, the lightweight online pitch detection algorithms commonly used in embedded DMIs are prone to octave errors and mispredictions, which undermines the reliability of the resulting gesture predictions. To increase reliability in separating musical gestures, we designed a rule-based correction that employs prior knowledge of the instrument's capabilities and its constraints on pitch manipulation during specific gestures, combined with analysis of the sensor state.
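A rule of this kind can be sketched as follows, under the assumption that a bend moves the pitch continuously within a small interval, so a sudden jump of roughly an octave reported by the tracker is treated as an estimation error. The function name and the bend-range constant are invented for illustration and are not taken from the project's implementation.

```python
# Hypothetical sketch of a rule-based octave correction. During a bend
# the string's pitch can only move continuously within a small interval,
# so an octave-sized jump from the pitch tracker is folded back toward
# the previous estimate. The constant below is illustrative.

MAX_BEND_SEMITONES = 3.0  # assumed physical limit of a bend

def correct_pitch(candidate_midi, previous_midi, bending):
    """Return a corrected MIDI pitch estimate for one string.

    candidate_midi: raw estimate from the lightweight pitch tracker
    previous_midi:  last accepted estimate for this string
    bending:        True if the bend sensor reports a gesture in progress
    """
    if not bending:
        return candidate_midi
    # Pick the octave transposition of the candidate closest to the
    # previous estimate:
    best = min((candidate_midi + 12.0 * k for k in (-2, -1, 0, 1, 2)),
               key=lambda p: abs(p - previous_midi))
    if abs(best - previous_midi) <= MAX_BEND_SEMITONES:
        return best
    # Still an implausible jump for a bend: keep the last estimate.
    return previous_midi
```

The sensor state gates the rule: when no bend is in progress, large pitch jumps are legitimate (a new fretted note) and the tracker's output is passed through unchanged.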

This project highlights how finding the right set of mappings to provide accurate information about performance parameters remains a challenging task. Even when complementary information is available from multiple sensing modalities and computational power allows more powerful signal analysis algorithms to run in real time, determining what constitutes a specific gesture requires a great deal of heuristic engineering. While simplistic mappings may fail to capture enough information, complex hand-crafted mappings lack flexibility and can be thrown off by any small error in the extraction of any feature involved in the mapping.

These findings open the door to the exploration of probabilistic techniques for designing intermediate mappings and control signals for augmented DMIs, based around models of interaction learnt from multimodal data. Following this project, we intend to take these ideas forward using a data-driven approach for creating flexible probabilistic models that can be adapted to similar designs for the augmented guitar and possibly other DMIs.

[1] A. Benito Temprano and A. McPherson. A TMR angle sensor for gesture acquisition and disambiguation on the electric guitar. Audio Mostly, Trento, Italy. 2021.
Exploitation Route The series of heuristic techniques developed during this project to combine sensor data and audio features within a real-time environment could easily be adopted and expanded by DMI designers who focus on augmenting interaction to enable novel ways of gestural control.

Furthermore, the toolkit could be used as provided by composers and performers looking to exploit new gestural capabilities and disambiguation as creative resources for the guitar.

On the other hand, researchers looking to gather enhanced datasets with information on performative gestures, for the analysis of guitar playing and other music information retrieval tasks, could also benefit from these developments. Similar techniques could be applied to instruments other than the guitar.
Sectors Creative Economy,Digital/Communication/Information Technologies (including Software)

Title Prototype of augmented guitar equipped with bend sensors and of accompanying hexaphonic processing unit 
Description A physical prototype of an augmented guitar equipped with hexaphonic output, including a sensing pickup capable of capturing the horizontal displacement of the guitar strings, was constructed as part of this project. The guitar was made compatible with any 13-pin hexaphonic system, but a processing unit was also designed to allow real-time capture of gestures from the sensing pickup and low-latency hexaphonic processing. This unit consists of a 1U half-rack box containing three Bela Mini sub-units (https://learn.bela.io/products/bela-boards/bela-mini/). Each sub-unit is dedicated to processing a pair of strings and the corresponding bending signals. The outputs of these are summed together so that the box can be used with any common guitar processing chain that accepts a stereo or mono line-level signal. The sub-units can communicate with each other using a serial protocol over UART or a custom parallel digital interface for exchanging synchronous messages from one Bela Mini to the other two. In addition, a software toolkit was also developed for the extraction of pitch and bend information from each string, the analysis of transients, and the isolation of specific bend gestures using heuristic rules based on bend sensor data and extracted audio features. Some proposed augmentation strategies were added to the toolkit, including the use of bend gestures for pitch manipulation and virtual feedback generation.
Type Of Technology Physical Model/Kit 
Year Produced 2021 
Impact The prototype and toolkit present a new opportunity for the analysis of performative gestures on the guitar, and enable new augmentations that tap into specific performative aspects of nuanced gestures which would otherwise be difficult to extract from analysis of audio features alone. This provides a tool for the creation of augmented datasets for guitar performance, and a creative device for both DMI designers and players to explore how changing the behaviour of specific gestures can be used in a performative context.
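The inter-board messaging described above (synchronous messages exchanged between the three Bela Mini sub-units over UART) could be framed along the following lines. This sketch is purely illustrative: the byte layout, start byte, and checksum are assumptions, not the unit's actual protocol.

```python
# Hypothetical sketch of a framing scheme for synchronous messages
# between sub-units over UART: start byte, message type, string id,
# one payload byte, and an additive checksum. All values illustrative.

START = 0xAA  # assumed frame delimiter

def encode_message(msg_type, string_id, value):
    """Pack a 5-byte sync frame: START, type, string id, value, checksum."""
    body = bytes([msg_type & 0xFF, string_id & 0xFF, value & 0xFF])
    checksum = sum(body) & 0xFF
    return bytes([START]) + body + bytes([checksum])

def decode_message(frame):
    """Return (msg_type, string_id, value), or None if the frame is invalid."""
    if len(frame) != 5 or frame[0] != START:
        return None
    body, checksum = frame[1:4], frame[4]
    if sum(body) & 0xFF != checksum:
        return None
    return body[0], body[1], body[2]
```

A simple checksum of this kind lets a receiving sub-unit discard frames corrupted on the serial line rather than acting on a spurious gesture message.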