Designing Interaction Freedom via Active Inference
Lead Research Organisation: University of Glasgow
Department Name: School of Computing Science
Abstract
Problem:
Reliable design of interactive systems using advanced sensors and machine learning (ML) is an unsolved problem. New sensors could expand how we interact with computers, but they remain hard to design for without overly constraining user behaviour.
AI algorithms can reduce human workload, but they can fail in complex contexts and can control, deskill and disempower people. We have no principled workflows for designing interaction that lets users flexibly share autonomy with supporting AI.
Objectives:
Integrate Active Inference theory into the human-computer interaction loop, linking human behaviour, sensed and represented via ML/inference embeddings, with dynamic mediating mechanisms to create end-to-end, mutually adaptive loops between humans and systems (see the sketch after this list).
Develop novel interaction mechanisms for explicit and implicit control of AI autonomy levels, empowering people via shared autonomy while maintaining their agency.
Create systematic, composable software tools for computational interaction design that support prototyping and analysis of ML-infused sensors coupled with humans, and that integrate probabilistic causal models to solve inverse problems with advanced sensors, adapting to closed-loop data.
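As a rough illustration of the perception-action loop the first objective describes, the Python sketch below runs a minimal discrete active-inference cycle: a belief over user intent is updated from (simulated) sensor observations, and a mediating action is chosen to minimise a simplified expected free energy (risk term only, the ambiguity term is omitted). All matrices, state labels and function names are hypothetical placeholders for exposition, not project code or APIs.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy generative model: 3 hypothetical user-intent states, 3 discretised sensor
# readings, 2 mediating actions (e.g. "assist" vs. "stay passive"). Values are
# illustrative placeholders, not fitted to any real sensor.
likelihood = np.array([[0.8, 0.1, 0.1],    # p(observation | intent)
                       [0.1, 0.8, 0.1],
                       [0.1, 0.1, 0.8]])
transition = np.stack([np.eye(3) * 0.85 + 0.05,    # p(next intent | intent, action 0)
                       np.full((3, 3), 1.0 / 3)])  # p(next intent | intent, action 1)
preferences = np.log(np.array([0.7, 0.2, 0.1]))    # log-preferences over observations

belief = np.ones(3) / 3  # uniform prior over the user's intent


def update_belief(belief, obs):
    """Perception step: Bayesian update of the intent belief given an observation."""
    posterior = likelihood[obs] * belief
    return posterior / posterior.sum()


def expected_free_energy(belief, action):
    """Risk term only: KL divergence of predicted observations from preferences."""
    predicted_states = transition[action] @ belief
    predicted_obs = likelihood @ predicted_states
    return float(predicted_obs @ (np.log(predicted_obs + 1e-12) - preferences))


for step in range(5):
    obs = int(rng.integers(3))  # stand-in for a discretised sensor reading
    belief = update_belief(belief, obs)
    action = min(range(2), key=lambda a: expected_free_energy(belief, a))
    print(step, obs, np.round(belief, 2), action)
```

In the project context, the random draws would be replaced by ML/inference embeddings of real sensor data, and the mediating action would adjust the system's level of autonomy rather than select from a toy pair of options.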
Impact:
By using ML to give users freedom to express themselves individually, we can make systems robust to user heterogeneity, ensure fairness for diverse users and enable creative uses of technology. Our tools will form the foundation of future usable interfaces with novel sensors and rich data spaces, and advances can be shared and rapidly built on, transforming HCI research workflows.
Applications:
1. Whole-hand touch interaction via soft, programmable electronic skin, for personalisable interfaces with novel forms, and for prosthetics.
2. Google's Soli radar for 3D gesture, pose and proxemic interaction.
3. Radio Frequency sensing of humans for context-aware interaction.
People
Roderick Murray-Smith (Principal Investigator)
Publications

Fischer, F. (2024). SIM2VR: Towards Automated Biomechanical Testing in VR.

Kaul, C. (2024). AI-Enabled Sensor Fusion of Time-of-Flight Imaging and mmWave for Concealed Metal Detection. Sensors (Basel, Switzerland).

Murray-Smith, R. (2024). Active Inference and Human-Computer Interaction.

Description: Google Advanced Engineering
Country: United States
Sector: Private
PI Contribution: Joint research on radar and machine learning.
Collaborator Contribution: Provision of radar hardware and the problem domain.
Impact: None yet.
Start Year: 2020

Description: Radboud cooperation
Organisation: Radboud University Nijmegen
Country: Netherlands
Sector: Academic/University
PI Contribution: Collaboration with Prof. Roel Vertegaal at Radboud.
Collaborator Contribution: Joint discussions and a joint funding proposal completed.
Impact: Funding proposal completed.
Start Year: 2025