Volumetric assets and immersive environments depicting future impacts of rising sea levels on the Broads and East Anglia coastline.

Lead Research Organisation: Norwich University of the Arts
Department Name: Research and Knowledge Transfer

Abstract

This proposal builds upon four related research projects: the Norwich University of the Arts Impact Case Study 'Public engagement with the Norfolk Broads: enhancing public understanding and engagement with nature conservation' (2019); 'Visualisations of space debris' (Off Earth, NIXON, 2021); 'Building Platform Technologies for Symbiotic Creativity in Hong Kong' (NIXON et al., 2021); and 'Future Cinema Systems' (NIXON et al., 2022).

Working in close partnership with the Broads Authority, and drawing on archive material and maps contained in the Strategic Flood Risk Assessment report 2018 (a detailed study of the impacts of rising sea levels and the effects of climate change), I will use 3D modelling and volumetric video to create realistic, immersive visualisations that accurately reimagine past, present, and future landscapes, depicting the Broads and East Anglia coastal areas under threat from rising sea levels.
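
To indicate the kind of processing involved, the minimal sketch below derives a flood-extent mask from a digital elevation model, which could later be used to texture or augment a 3D terrain model. It is illustrative only: the file name broads_dem.tif and the 1.1 m sea-level figure are hypothetical placeholders, not data or projections taken from the Strategic Flood Risk Assessment report.

# Minimal sketch: flag terrain cells below a projected sea level on a
# digital elevation model (DEM) and save the inundation mask as an image.
# 'broads_dem.tif' and the 1.1 m rise are illustrative placeholders only.
import numpy as np
import rasterio
import matplotlib.pyplot as plt

SEA_LEVEL_RISE_M = 1.1   # hypothetical projection, for illustration only

with rasterio.open("broads_dem.tif") as dem:
    elevation = dem.read(1).astype(float)          # elevation above datum, metres
    if dem.nodata is not None:
        elevation[elevation == dem.nodata] = np.nan

# Cells at or below the projected sea level are treated as at risk of inundation.
inundated = elevation <= SEA_LEVEL_RISE_M

fig, ax = plt.subplots(figsize=(8, 6))
ax.imshow(elevation, cmap="terrain")               # base terrain
ax.imshow(np.where(inundated, 1.0, np.nan),        # translucent overlay on flooded cells
          cmap="Blues", alpha=0.6, vmin=0, vmax=1)
ax.set_title("Illustrative inundation mask at +1.1 m sea level")
ax.axis("off")
fig.savefig("inundation_overlay.png", dpi=200, bbox_inches="tight")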

Practice-based research will be presented in a visually aesthetic and immersive form within a specialist visualisation system. Viewers will experience the East Anglia and Broads landscape by moving through a virtual world, viewing realistic representations augmented with flood risk visualisations, enabling audiences and stakeholders to experience, explore, and consider the effects of flooding upon the Broads and East Anglia coastline. The specialised screen and sound system will provide the viewer with enhanced immersion, creating a greater sense of embodiment and presence.

This research combines the latest volumetric capture with advanced 360-degree screen and surround sound technologies to create a platform for developing and viewing immersive content. Through the integration of these technologies, and powered by artificial intelligence, deep learning, virtual reality, augmented reality, interactive narrative, and generative aesthetics, the system will deliver a new interactive and immersive architecture for participant spectators. It will be a vital resource for practice-based research and experimentation: exploring archive content and aesthetic and cultural domains, and generating digital assets for training and education in health, sport, architecture, and other areas where the accurate simulation of real-life situations is applicable.

The system will enable audiences to explore the landscapes of the past, present, and future with greater immersion, measured by considering image quality, 3D vision, field of view, tracking level, sound quality, user perspective, and resolution. Its highly immersive characteristics will provide greater sensations of embodiment, and its viewing autonomy will give the viewer an enhanced sensation of presence; in virtual environments, presence is understood as place illusion, the qualia of being located inside the virtual world (Slater, 2009).

To introduce new artistic expression and content through the system, I will focus on leveraging important developments in machine learning and generative models, providing a platform for artists and designers working with Generative Adversarial Networks (GANs) and Creative Adversarial Networks (CANs).
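
As a point of reference for the generative models mentioned above, the sketch below shows a single GAN training step in PyTorch: a generator learns to produce images that a discriminator cannot distinguish from real ones. The network sizes, 64x64 image shape, and random placeholder data are assumptions for illustration, not the project's architecture or training data.

# Minimal GAN sketch (PyTorch): one generator/discriminator training step on
# random placeholder images standing in for archive imagery.
import torch
import torch.nn as nn

LATENT_DIM, IMG_PIXELS = 100, 64 * 64

generator = nn.Sequential(
    nn.Linear(LATENT_DIM, 256), nn.ReLU(),
    nn.Linear(256, IMG_PIXELS), nn.Tanh(),          # outputs a flattened image
)
discriminator = nn.Sequential(
    nn.Linear(IMG_PIXELS, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1), nn.Sigmoid(),                # probability the image is real
)

opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
bce = nn.BCELoss()

real_images = torch.rand(32, IMG_PIXELS)            # placeholder batch
real_labels = torch.ones(32, 1)
fake_labels = torch.zeros(32, 1)

# Discriminator step: distinguish real images from generated ones.
fake_images = generator(torch.randn(32, LATENT_DIM)).detach()
loss_d = (bce(discriminator(real_images), real_labels)
          + bce(discriminator(fake_images), fake_labels))
opt_d.zero_grad()
loss_d.backward()
opt_d.step()

# Generator step: produce images the discriminator classifies as real.
fake_images = generator(torch.randn(32, LATENT_DIM))
loss_g = bce(discriminator(fake_images), real_labels)
opt_g.zero_grad()
loss_g.backward()
opt_g.step()

print(f"discriminator loss {loss_d.item():.3f}, generator loss {loss_g.item():.3f}")

A Creative Adversarial Network follows the same adversarial structure but adds a style-ambiguity term to the generator loss, pushing outputs away from established style categories; that variation is omitted here for brevity.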

The project will facilitate greater collaboration between creative practitioners and the technology sector. The outcomes will include new creative media practice and immersive, interactive image experiences which can be exploited by the scientific, creative, and cultural industries, providing innovative ways of engaging with global challenges.
