Inclusive Design of Immersive Content
Lead Research Organisation: UNIVERSITY OF CAMBRIDGE
Department Name: Engineering
Abstract
Abstracts are not currently available in GtR for all funded research. This is normally because the abstract was not required at the time of proposal submission, but may be because it included sensitive information such as personal details.
Publications

Dudley J (2023) Inclusive Immersion: a review of efforts to improve accessibility in virtual reality, augmented reality and the metaverse, in Virtual Reality

Hetzel L (2021) Complex Interaction as Emergent Behaviour: Simulating Mid-Air Virtual Keyboard Typing using Reinforcement Learning, in IEEE Transactions on Visualization and Computer Graphics

Description | We have written a complete manuscript that reviews and synthesises the literature on supporting virtual and augmented reality content consumption and presents emerging design principles and a vision we call "Inclusive Immersion". It is currently under final review for the journal Virtual Reality. We have also submitted a manuscript based on a survey of 100 respondents with disabilities, investigating obstacles and barriers to inclusion when consuming virtual and augmented reality content. A key result is that most respondents face significant barriers to accessing such content. This submission is still under review. In addition, we have developed novel means of synthetically generating user data using generative adversarial networks (GANs), which we have demonstrated in two publications (IEEE FG 2021 and IEEE ISMAR 2021); a minimal illustrative sketch of this approach follows this record. We have further developed a reinforcement learning agent that has learned to simulate a user typing with two index fingers on a mid-air keyboard, generating data that approximates actual user data to a high degree. Such models will make it easier to construct Inclusive Immersion models later in this project. |
Exploitation Route | Ongoing analysis of barriers to inclusion can be used to issue design guidelines for industry to improve accessibility, and may serve as background material for legislation and/or further regulation in this area. The project has also produced new ways of synthesising user data for augmented reality headset interactions. |
Sectors | Creative Economy, Digital/Communication/Information Technologies (including Software)
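
The GAN-based synthetic user data generation mentioned above follows the standard adversarial training recipe: a generator learns to map random latent vectors to plausible interaction traces while a discriminator learns to tell recorded traces from synthetic ones. The sketch below (in PyTorch) is a rough, hypothetical illustration only; the trajectory representation, sequence length, and all layer sizes are assumptions, not the project's published architecture.

import torch
import torch.nn as nn

SEQ_LEN, DIMS, LATENT = 50, 3, 32  # e.g. 50 timesteps of (x, y, z) per trace

class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(LATENT, 128), nn.ReLU(),
            nn.Linear(128, SEQ_LEN * DIMS),
        )

    def forward(self, z):
        # Map a latent vector to a flattened trajectory, then reshape.
        return self.net(z).view(-1, SEQ_LEN, DIMS)

class Discriminator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(SEQ_LEN * DIMS, 128), nn.ReLU(),
            nn.Linear(128, 1),  # real/fake logit per trajectory
        )

    def forward(self, traj):
        return self.net(traj.flatten(1))

G, D = Generator(), Discriminator()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

def train_step(real):
    # real: (batch, SEQ_LEN, DIMS) tensor of recorded user trajectories.
    batch = real.size(0)
    z = torch.randn(batch, LATENT)
    fake = G(z)
    # Discriminator step: score recorded traces as real, synthetic as fake.
    loss_d = (bce(D(real), torch.ones(batch, 1))
              + bce(D(fake.detach()), torch.zeros(batch, 1)))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()
    # Generator step: update G so its samples are scored as real.
    loss_g = bce(D(G(z)), torch.ones(batch, 1))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()

Once trained, sampling G(torch.randn(n, LATENT)) yields n synthetic trajectories that can stand in for recorded user data.
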
Description | Towards an Equitable Social VR |
Amount | £387,187 (GBP) |
Funding ID | EP/W02456X/1 |
Organisation | Engineering and Physical Sciences Research Council (EPSRC) |
Sector | Public |
Country | United Kingdom |
Start | 01/2023 |
End | 06/2025 |
Title | Mechanical Turk Accessibility (CHI 2021) Dataset |
Description | Dataset of two surveys and an interview from a study to understand accessibility and human factors issues in online crowd work. Participants responded to the surveys through Amazon Mechanical Turk and provided information on demographics as well as how they engage in online work on the platform. The dataset contains data from the two surveys and the interview (viewable on separate tabs). Invalid ("garbage") responses have been removed from the survey 2 data. The publication that emerged from analysing the dataset can be found at https://doi.org/10.1145/3411764.3445291 (in press).
Type Of Material | Database/Collection of data |
Year Produced | 2021 |
Provided To Others? | Yes |
Impact | None known. |
URL | https://www.repository.cam.ac.uk/handle/1810/316069 |
Title | Research data supporting "Crowdsourcing Design Guidance for Contextual Adaptation of Text Content in Augmented Reality" |
Description | Participant data corresponding to experiments described in "Crowdsourcing Design Guidance for Contextual Adaptation of Text Content in Augmented Reality". The INDEX tab in the attached xlsx datasheet contains a detailed description of the dataset and a glossary of terms.
Type Of Material | Database/Collection of data |
Year Produced | 2021 |
Provided To Others? | Yes |
Impact | None known. |
URL | https://www.repository.cam.ac.uk/handle/1810/321453 |
Title | Research data supporting "Gesture Knitter: A Hand Gesture Design Tool for Head-Mounted Mixed Reality Applications" |
Description | Data corresponding to experiments described in "Gesture Knitter: A Hand Gesture Design Tool for Head-Mounted Mixed Reality Applications". The zipped folder participant_data.zip contains the raw hand tracking files for each of the eight participants. This includes the fine primitive gestures (for both hands), the gross gestures (right and left hands), the one-handed and two-handed complex gestures, the continuous online recognition data for one-handed and two-handed gestures, as well as the designed primitive and complex gestures if that individual participated in Study 2, the design study. The raw data includes the x, y, z coordinates and the quaternions qx, qy, qz, and qw for each time step of the hand trajectory. The tracked elements are abbreviated as follows: camera (cam), palm (plm), wrist (wrs), thumb (th), index (in), middle (mi), ring (ri), pinky (pi). The character "r" or "l" appended to any of these abbreviations denotes the right or left hand respectively; a hypothetical parsing sketch follows this record. CHI2021_Gesture_Knitter_supporting_data.xlsx contains the processed data derived from the raw data. It reports the recognition rates for each of the cases examined in our recognition experiments: training with all the primitive data, cross-validation results, and the recognition results from the synthetic data generation. The user study questionnaire sheet shows user responses to the post-study questionnaire conducted in the design study. The decoding text files (decoding_one_hand.txt, decoding_two_hand.txt, and online_recognition.txt) contain the output of the decoder when fed the various complex gesture traces; the number at the end of each decoding output is the edit distance to the correct declaration. The online results show the output of the online recognition experiment, including gestures that are misclassified, false activations, and failures to recognise a gesture within the time frame.
Type Of Material | Database/Collection of data |
Year Produced | 2021 |
Provided To Others? | Yes |
Impact | None known. |
URL | https://www.repository.cam.ac.uk/handle/1810/321454 |
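
As a rough illustration of how the raw tracking files described in this record might be parsed, the sketch below loads the position and quaternion columns for one tracked element. The column naming (element abbreviation joined with the hand character, then the coordinate name) is an assumption inferred from the description above, not the dataset's documented schema.

import numpy as np
import pandas as pd

# Tracked-element abbreviations as described in the record.
TRACKED = ["cam", "plm", "wrs", "th", "in", "mi", "ri", "pi"]

def load_trajectory(path, element="wrs", hand="r"):
    """Return (T, 3) positions and (T, 4) quaternions for one tracked element.

    Assumes columns named like 'wrsr_x' and 'wrsr_qx' (a hypothetical
    naming scheme for illustration): abbreviation + hand ('r' or 'l'),
    then the coordinate name.
    """
    df = pd.read_csv(path)
    key = element + hand
    pos = df[[f"{key}_{c}" for c in ("x", "y", "z")]].to_numpy()
    quat = df[[f"{key}_{c}" for c in ("qx", "qy", "qz", "qw")]].to_numpy()
    return pos, quat
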
Title | Research data supporting "HotGestures: Complementing Command Selection and Use with Delimiter-Free Gesture-Based Shortcuts in Virtual Reality" |
Description | 1. HotGestureData.zip contains the dataset used for training the recogniser in the HotGestures paper. The dataset consists of hand skeleton data for 10 gesture classes and the Null gesture class, collected from 8 participants. Each csv file is a gesture clip that records the trajectory of 1) the ABSOLUTE 3D position (x, y, z) and quaternion (x, y, z, w) of the wrist, and 2) the positions and rotations of the other joints RELATIVE to the wrist (see the sketch after this record). Please read the markdown (.md) file for more details. 2. HotGestureSource.zip contains the Python source code of the recogniser. One can train this recogniser on the dataset above to implement HotGestures.
Type Of Material | Database/Collection of data |
Year Produced | 2023 |
Provided To Others? | Yes |
URL | https://www.repository.cam.ac.uk/handle/1810/354385 |
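
Because each clip stores the wrist pose absolutely and every other joint relative to the wrist, reconstructing absolute joint positions means rotating each relative offset by the wrist quaternion and translating by the wrist position. A minimal sketch, assuming unit quaternions in (x, y, z, w) order as the description suggests:

import numpy as np

def quat_rotate(q, v):
    """Rotate vector v by the unit quaternion q = (x, y, z, w)."""
    x, y, z, w = q
    u = np.array([x, y, z])
    # Standard rotation formula for unit quaternions.
    return (2 * np.dot(u, v) * u
            + (w * w - np.dot(u, u)) * v
            + 2 * w * np.cross(u, v))

def joint_to_world(wrist_pos, wrist_quat, joint_pos_rel):
    # Transform a wrist-relative joint position into world coordinates.
    return np.asarray(wrist_pos) + quat_rotate(wrist_quat, np.asarray(joint_pos_rel))
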
Title | Research data supporting "Understanding, Detecting and Mitigating the Effects of Coactivations in Ten-Finger Mid-Air Typing in Virtual Reality" |
Description | Coactivation data corresponding to the analysis presented in "Understanding, Detecting and Mitigating the Effects of Coactivations in Ten-Finger Mid-Air Typing in Virtual Reality." The attached datasheet contains two tabs: TOUCH EVENT DATA and LAYOUT DATA. TOUCH EVENT DATA details the touch events and their attributes. Feature values are provided for participants 1 to 12. Detailed definitions of these features are provided in the associated publication. LAYOUT DATA details the distribution of touch events and coactivations over the keyboard layout. |
Type Of Material | Database/Collection of data |
Year Produced | 2021 |
Provided To Others? | Yes |
Impact | None known. |
URL | https://www.repository.cam.ac.uk/handle/1810/321455 |
Description | Five ways the metaverse could be revolutionary for people with disabilities |
Form Of Engagement Activity | A press release, press conference or response to a media enquiry/interview |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Public/other audiences |
Results and Impact | Article about the research published in The Conversation.
Year(s) Of Engagement Activity | 2022 |
URL | https://theconversation.com/five-ways-the-metaverse-could-be-revolutionary-for-people-with-disabilit... |