Sonifying the self: Biometric and real-time data handling for the creation of bespoke music compositions, immersive narratives and new mixed realities
Lead Research Organisation:
University of Manchester
Department Name: Arts, Languages and Cultures
Abstract
This project seeks to appropriate current sensor technologies to develop novel computer systems and tools for music composition. Using real-time data, digital signal processing (DSP) and biosensors, music can be composed in bespoke ways through technologies that monitor, sonify and output biological data (biodata). Virtual environments rendered within game engines can also generate visual narratives from biodata, augmenting the music with bespoke visuals. Biosensor technology offers new methods for performing and composing music in real and biometrically rendered virtual environments, and these two environments can also be amalgamated. Combining real and digital media could yield novel compositional gestures and richer immersive narratives, owing to the heightened immersion game engines afford. Probing this amalgamation of the real and the virtual thus creates new interactive technology for use in music composition and game engines, and it informs existing research and literature on the synthesis of real and virtual spaces. Moreover, sonifying and rendering biodata provokes several new lines of enquiry and metaphysical questions surrounding bio-digitisation, ethics and existence.
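As a minimal sketch of the sonification idea described above, the fragment below linearly maps a stream of biosensor readings (here assumed to be heart rate in beats per minute) onto MIDI note numbers. The sensor range, note range and function name are illustrative assumptions, not part of the project's actual system.

```python
# Hypothetical sonification sketch: rescale biodata samples to MIDI notes.
# The sensor range (50-120 BPM) and note range (C3-C6) are assumptions.

def sonify(samples, lo=50.0, hi=120.0, note_lo=48, note_hi=84):
    """Linearly map each biodata sample onto a MIDI note number."""
    notes = []
    for s in samples:
        s = min(max(s, lo), hi)          # clamp to the assumed sensor range
        frac = (s - lo) / (hi - lo)      # normalise to 0..1
        notes.append(round(note_lo + frac * (note_hi - note_lo)))
    return notes

# Example: a rising heart-rate trace becomes an ascending melodic line.
print(sonify([50, 67.5, 85, 102.5, 120]))  # → [48, 57, 66, 75, 84]
```

In practice such a mapping would feed a real-time DSP or synthesis layer rather than print note numbers, but the core gesture, biodata in, musical parameters out, is the same.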
Organisations
People
Ricardo Climent (Primary Supervisor)
Christopher Rhodes (Student)
Publications
Rhodes C (2020) New Interfaces and Approaches to Machine Learning When Classifying Gestures within Music, in Entropy