Datasounds, datasets and datasense: Unboxing the hidden layers between musical data, knowledge and creativity
Lead Research Organisation: Kingston University
Department Name: The School of Arts
Abstract
This network aims to identify core questions that will drive forward the next phase in data-rich music research, focused in particular on creative music making. The increased availability of digital music data, combined with new data science techniques, is already opening new possibilities for making, studying and engaging with music. This direction is only likely to accelerate, upending many current practices, opening up creative avenues and offering new opportunities for research. However, the rapid technological progress, with new techniques producing surprising results in quick succession, is often disconnected from the knowledge and know-how gained by musicians through creativity, practice and research. By bringing together researchers and practitioners from different underlying disciplines and with a wide range of expertise, the network will establish a better foundation for future research. Performers, composers and improvisers will contribute through embodied knowledge and practice-based methods; researchers in psychology will bring insights about the cognitive, affective and behavioural processes underpinning musical experience; and data scientists will add analytical expertise as well as relevant theories, methods and techniques. Together, these will lead to significant conceptual breakthroughs in data-driven approaches and technologies applied to music.
The new data-based technologies usually rely on large data sets, and they can also produce very large amounts of data. As part of the network activities we will map the limitations of existing music representations, identify the gaps that need to be addressed and propose pathways to improve representation formats (a minimal illustration of this kind of representation question appears in the sketch below). We do not envision developing a single, all-encompassing representation that captures the full richness of musical experience. Nevertheless, through the dialogue that this network will facilitate, we will be able to outline ways of improving on existing representation formats and develop methods for visualising, analysing and interpreting large data sets. The network will also consider ethical and legal implications, such as how best to address the challenges that Artificial Intelligence (AI) poses to existing musical practices and the fear that this technology induces. Some of these are common to many fields where AI is being applied to tasks which were, until very recently, the preserve of humans. Music offers a unique perspective on these wider problems: the opacity of 'black box' generative models is a low-risk research challenge, not a potentially dangerous tool that may entrench existing injustices. By embedding the ethical dimension into the discussion of the future of data-driven music research, the network will serve as a model for other fields.
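As a minimal sketch of the representation question raised above (not project code): reducing a MIDI file to a piano-roll matrix keeps pitch, timing and coarse dynamics but discards timbre, pedalling, articulation and much expressive nuance. The file name and frame rate below are placeholders, and pretty_midi is only one of several toolkits that could be used.

```python
# Minimal illustration: what a piano-roll representation of a MIDI file
# keeps, and what it throws away. 'performance.mid' is a placeholder path.
import numpy as np
import pretty_midi

midi = pretty_midi.PrettyMIDI("performance.mid")

# Piano roll: 128 MIDI pitches x time frames, sampled at 100 frames/sec.
roll = midi.get_piano_roll(fs=100)
print("piano-roll shape (pitches x frames):", roll.shape)

# What survives: pitch, onset/offset timing, coarse dynamics (velocity).
active_pitches = np.count_nonzero(roll.sum(axis=1))
print("distinct pitches used:", active_pitches)

# What is lost: timbre, micro-timing below the frame rate, articulation
# and any audio-level expressive detail - exactly the gaps in existing
# representations that the network sets out to map.
```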
The core activities of the network are two workshops where short presentations will provide a springboard for in-depth discussions; a concert by practitioners with relevant experience will help connect the theoretical discussions to the reality of music making. These activities will enable a multidimensional exchange of ideas and methods. Material from the workshops will be shared online to document the process and provide a platform for engaging wider audiences with the approach taken and the significant results obtained.
Data-driven technologies are already affecting the way in which we understand, make and consume music within current cultural and economic contexts, raising complex and unprecedented ethical and legal considerations. This network will identify core questions that can propel forward data-driven research into creative music making while considering social and individual needs. We will also outline specific research projects that address the shared concerns and bridge the gaps between the different methods that, in many ways, bound our disciplines. The network builds on previous AHRC-funded research by the PI (AH/N504531/1 and AH/R004706/1) applying data to creative music making in a particular domain.
Title | AI Improvised Music Duo |
Description | AI Music Improvisation by Franziska Schroeder (sax/AI art) and Federico Reuben (ML/live coding). This is a 20-30 minute live improvised duo performance set, using machine listening and AI-generated music materials. The PRISM SampleRNN neural network was trained on original saxophone input by Franziska Schroeder (a hedged sketch of this kind of data preparation follows this record). In this duo we combine live improvised music with AI-generated source material (i.e. some of the input from the PRISM neural network training data set) in a live coding setup. |
Type Of Art | Performance (Music, Dance, Drama, etc) |
Year Produced | 2024 |
Impact | Develop novel methods for integrated improvisation practice |
URL | https://aimc2023.pubpub.org/pub/8x9jxz9a/release/3 |
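The record above mentions training a SampleRNN-style network on saxophone recordings. The following is a hedged sketch of one common way to prepare such an audio corpus (resampling, chunking, normalising); the actual PRiSM SampleRNN tooling has its own preprocessing, and the file names, sample rate and chunk length here are illustrative assumptions.

```python
# Hedged sketch only (not the project's pipeline): prepare fixed-length,
# peak-normalised excerpts from a source recording for a SampleRNN-style
# model. File names, SR and CHUNK_SECONDS are placeholders.
import os

import librosa
import numpy as np
import soundfile as sf

SR = 16000          # sample rate assumed for the model
CHUNK_SECONDS = 8   # length of each training excerpt (assumption)

audio, _ = librosa.load("schroeder_saxophone.wav", sr=SR, mono=True)

os.makedirs("chunks", exist_ok=True)
chunk = SR * CHUNK_SECONDS
for i in range(len(audio) // chunk):
    segment = audio[i * chunk:(i + 1) * chunk]
    # Peak-normalise each excerpt so the model sees a consistent range.
    peak = float(np.max(np.abs(segment)))
    if peak == 0.0:
        peak = 1.0
    sf.write(f"chunks/sax_{i:04d}.wav", segment / peak, SR)
```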
Title | Duo improvisation between piano and AI-driven computer system |
Description | Pianist David Dolan and composer Oded Ben-Tal (PI for this research) improvising as a duo with Ben-Tal's computer code acting as a semi-autonomous improvising agent. |
Type Of Art | Performance (Music, Dance, Drama, etc) |
Year Produced | 2022 |
Impact | Performing these improvisations stimulates debate around the social meaning of creativity, the role of computers in creative activities and the use of music data for creating music. |
URL | https://stream.gsmd.ac.uk/View.aspx?id=65262~5i~airjgQkg8l&code=CP~VpKaoDWBMLE2BnP3rxWwUgBABSI9PtMzD... |
Title | Musician's Perspective on Improvisation |
Description | Franziska Schroeder and Federico Reuben using AI in free improvisation. The event took place at the invitation of Exploratorium Berlin, 3 February 2024. We use a new RAVE model trained on saxophone sounds by Franziska Schroeder. The RAVE model (the Schroeder model) is combined with live saxophone sounds and visuals generated using Stable Diffusion (a hedged sketch of offline RAVE processing follows this record). Featuring Federico Reuben (live coding) and Franziska Schroeder (soprano saxophone and AI visuals). |
Type Of Art | Performance (Music, Dance, Drama, etc) |
Year Produced | 2024 |
Impact | The performance and the symposium it was part of contributed to further discussion of the importance of improvisation in the context of data-driven music technology. |
URL | https://www.youtube.com/watch?v=Qykht81t32c |
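For readers unfamiliar with RAVE, the sketch below shows how exported RAVE models are typically used offline, assuming a TorchScript export with encode/decode methods and an expected input shape of (batch, channel, samples). The model file, input file and sample rate are placeholders; the live performance described above runs in a real-time live-coding environment rather than a script like this.

```python
# Hedged sketch: pass saxophone audio through an exported RAVE model
# offline. "schroeder_rave.ts", the 48 kHz rate and the tensor layout
# are assumptions, not details of the performers' actual setup.
import librosa
import soundfile as sf
import torch

model = torch.jit.load("schroeder_rave.ts").eval()

audio, sr = librosa.load("sax_input.wav", sr=48000, mono=True)
x = torch.from_numpy(audio).float().reshape(1, 1, -1)

with torch.no_grad():
    # encode() compresses the audio into a short latent sequence;
    # decode() resynthesises audio in the timbre the model was trained on.
    z = model.encode(x)
    z = z + 0.5 * torch.randn_like(z)   # perturb latents to vary the output
    y = model.decode(z)

sf.write("sax_reimagined.wav", y.squeeze().numpy(), sr)
```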
Title | The Odd Couple |
Description | A duo improvisation between pianist David Dolan and an artificial improviser developed by Oded Ben-Tal. The improvisation is based on an expanded tonal-modal idiom but does not conform to a specific musical style. The artificial improviser uses machine listening techniques to extract features from the piano's audio signal, analyses them in real time, and aims to make musical inferences about the content (a toy sketch of this kind of pipeline follows this record). This information is used by the system to generate responses that open up space for musical dialogue between the pianist and the system. Ben-Tal adjusts parameters in the system during the performance to shape larger-scale aspects, but the moment-to-moment generation of musical material happens automatically within the computer. |
Type Of Art | Performance (Music, Dance, Drama, etc) |
Year Produced | 2023 |
Impact | New insights about the capacity of data-driven systems to engage human performers and listeners in active music making |
URL | https://youtu.be/ztPmrMBfeFU |
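The description above outlines the architecture (machine listening, real-time analysis, generated response) rather than the code itself. The toy sketch below illustrates that general pipeline offline, using librosa for feature extraction and an invented response rule; none of the file names, thresholds or policies here come from Ben-Tal's actual improviser.

```python
# Toy sketch of a machine-listening -> inference -> response loop
# (illustrative only; not the system used in the performance).
import librosa
import numpy as np

audio, sr = librosa.load("piano_phrase.wav", sr=22050, mono=True)

# Machine listening: locate note onsets and summarise pitch content.
onsets = librosa.onset.onset_detect(y=audio, sr=sr, units="time")
chroma = librosa.feature.chroma_cqt(y=audio, sr=sr)

# Crude "musical inference": the most prominent pitch class overall,
# and onset density as a proxy for the pianist's activity level.
pitch_class = int(np.argmax(chroma.mean(axis=1)))
density = len(onsets) / (len(audio) / sr)

# Toy response policy: answer dense playing with sparser material in a
# related pitch area, and vice versa.
response_pitch = (pitch_class + 7) % 12      # a fifth above
response_rate = max(0.5, 4.0 - density)      # events per second
print(f"respond around pitch class {response_pitch}, "
      f"about {response_rate:.1f} events/sec")
```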
Title | The Odd Couple - Human+AI Real-Time Improvisation |
Description | With this concert, the possibilities of musical improvisation are expanded beyond human limits as concert pianist David Dolan and an artificial improviser developed by composer and researcher Oded Ben-Tal carry out a duo improvisation. The improvisation is based on an expanded tonal-modal idiom but does not conform to a specific musical style. The artificial improviser does not try to model tonal-modal thinking or directly imitate Dolan's material; neither is it mainly an extension of Ben-Tal's own musical idiom. Rather, it aims to open a space for human-computer musical dialogue by generating novel material that draws on Dolan's playing on the one hand and Ben-Tal's compositional sensibilities on the other. The research-creation process is therefore one of joint discovery, as both Dolan and Ben-Tal venture beyond their previous musical experiences. |
Type Of Art | Performance (Music, Dance, Drama, etc) |
Year Produced | 2024 |
Impact | Audience members experienced high-quality music making through joint improvisation between a human performer and an AI system. The questions surrounding the use of this innovative technology were further explored through dialogue between the performers and with the audience. |
URL | https://www.aesthetics.mpg.de/en/the-institute/events/events/article/the-odd-couple0.html |
Description | Creative musical dialogues between human and machine: a novel approach to studying improvisation and joint action |
Amount | € 135,000 (EUR) |
Organisation | Volkswagen Foundation |
Sector | Charity/Non Profit |
Country | Germany |
Start | 01/2024 |
End | 06/2025 |
Description | Performance partnership between Federico Reuben and Franziska Schroeder |
Organisation | Queen's University Belfast |
Country | United Kingdom |
Sector | Academic/University |
PI Contribution | This new collaboration resulted from the researchers meeting at the first workshop (30/4/2022) organised by this network. The performance they developed through a research process makes innovative use of data tools to create an audio-visual performance illustrating the capacity of data to enable musical creativity. The first performance took place during the second workshop organised by the network, on 21/11/2022 in Stockholm, Sweden. |
Collaborator Contribution | The two collaborators, Dr Reuben and Prof. Schroeder, developed their improvisation-based performance through discussions and rehearsals. Each of them makes use of data tools to open new opportunities for creating sounds and visuals. The research-creation process combines artistic exploration and technological development in tandem and through collaboration. |
Impact | This collaboration resulted in a public concert performance. The collaboration is multidisciplinary, straddling music, technology (computing and AI) and visual art. |
Start Year | 2022 |
Description | Practice-based research collaboration between Oded Ben-Tal and David Dolan |
Organisation | Guildhall School of Music & Drama |
Country | United Kingdom
Sector | Academic/University |
PI Contribution | The collaboration was enabled by the participation of Prof. Dolan in the first workshop organised by the network (York, 30/4/2022). Following the workshop, Dolan and Ben-Tal (PI on the network) began developing a collaborative research project on human-AI improvisation. The first performance resulting from this research took place as part of the second workshop organised by the network (Stockholm, 21/11/2022). |
Collaborator Contribution | Prof. Dolan brings his extensive expertise in improvisation: as an improvising pianist when he performs with the AI system; as a researcher on improvisation and creativity as we develop the research; and as a teacher of improvisation, drawing on his pedagogical experience. |
Impact | Outputs so far are performances of human-AI improvisation: 21/11/2022 in Stockholm and 3/2/2023 in London. The project is multidisciplinary, spanning music and computing/AI. |
Start Year | 2022 |
Description | Datasounds - when music meets AI |
Form Of Engagement Activity | A broadcast e.g. TV/radio/film/podcast (other than news/press) |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Public/other audiences |
Results and Impact | Created and released a documentary film about the research of the "Datasounds, Datasets and Datasense Research Network" into music, music data and the future that AI could bring to music composition, creation and curation. The film includes interviews and extracts from events and performances, and places the work in relation to current public debates.
Year(s) Of Engagement Activity | 2022,2023,2024 |
URL | https://www.youtube.com/watch?v=siGwKzqflrY |
Description | Interview for The Today programme on BBC radio 4 |
Form Of Engagement Activity | A press release, press conference or response to a media enquiry/interview |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Public/other audiences |
Results and Impact | Dr Ben-Tal (network PI) was interviewed about his recent work with music AI. He was asked for his opinion on this fast-developing technology and its impact on society and culture.
Year(s) Of Engagement Activity | 2023 |
Description | Music, Data, Live |
Form Of Engagement Activity | Participation in an activity, workshop or similar |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Public/other audiences |
Results and Impact | We hosted an evening event which included presentations about various aspects of our work, live performances with data capture, and an open discussion with attending audiences about the implications of the research for the future.
Year(s) Of Engagement Activity | 2022,2023 |