ALIVEmusic - Augmented Live music performance using Immersive Visualisation and Emotion

Lead Participant: SENSING FEELING LIMITED

Abstract

The ALIVEmusic project investigates how live music performances can be meaningfully enhanced through Mixed Reality, leveraging musical expression and emotion analytics based on the Internet of Things and Artificial Intelligence. We adopt a participatory design approach to better understand the needs and expectations of audiences and performers in the digital age, and to instigate content-driven solutions that boost audience engagement. The project will blend performative data harnessed by smart musical instruments with data characterising the emotional states of performers and audiences into a cohesive aesthetic visualisation. Through mapping techniques, computer-generated visual content will be dynamically layered over stage elements in real time. Initial prototypes will be assessed, offering audiences unprecedented immersive experiences via either Mixed Reality headsets such as Microsoft HoloLens or Augmented Reality on mobile phones.
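
As an illustration of the kind of real-time blending described above, the sketch below combines a performative signal (a MIDI-style note velocity from a smart instrument) with an estimated emotional arousal value into a single visual intensity parameter that a renderer could map to, for example, particle density or colour saturation. This is a minimal, hypothetical example: the names, the 0.6/0.4 weighting, and the velocity/arousal inputs are assumptions for illustration only, not part of the ALIVEmusic system.

    # Purely illustrative sketch (not project code): blend a performative
    # signal with an estimated emotional state into one visual parameter.
    # All names, ranges, and weightings here are hypothetical.

    from dataclasses import dataclass

    @dataclass
    class EmotionEstimate:
        valence: float  # -1.0 (negative) .. 1.0 (positive)
        arousal: float  #  0.0 (calm)     .. 1.0 (excited)

    def blend_visual_intensity(note_velocity: int,
                               emotion: EmotionEstimate,
                               performance_weight: float = 0.6) -> float:
        """Map a MIDI-style note velocity (0-127) and arousal into a 0..1 intensity."""
        performative = max(0.0, min(note_velocity / 127.0, 1.0))
        affective = max(0.0, min(emotion.arousal, 1.0))
        return performance_weight * performative + (1.0 - performance_weight) * affective

    if __name__ == "__main__":
        estimate = EmotionEstimate(valence=0.4, arousal=0.8)
        # Prints roughly 0.79 for a loud note played to an excited audience.
        print(blend_visual_intensity(note_velocity=100, emotion=estimate))

In practice such a value would be recomputed continuously and streamed to the rendering layer (a Mixed Reality headset or a mobile Augmented Reality client), but the weighting, transport, and rendering details would depend on design decisions not specified here.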

Lead Participant

SENSING FEELING LIMITED (Project Cost: £20,857; Grant Offer: £14,600)

Participants

QUEEN MARY UNIVERSITY OF LONDON (Project Cost: £17,793; Grant Offer: £17,793)
BE MORE DIGITAL LIMITED
FRACTURE GAMES LIMITED (Project Cost: £20,689; Grant Offer: £14,483)
INNOVATE UK
