Future Human-Machine Interfaces (HMIs) for Automated Vehicles
Lead Research Organisation: University of Nottingham
Department Name: Faculty of Engineering
Abstract
Vehicle automation is heralded as a technology that can save lives on the road, improve traffic efficiency and reduce driver stress, but it can also disengage the driver from driving-related tasks. This issue will be critical to the success or failure of automation, given the many situations in which the vehicle is likely to require human intervention (Levels 2/3) or in which the human occupant decides they wish to drive (Levels 2-5).
This PhD project aims to develop guidance for future in-vehicle experiences that equip drivers to take or hand over control when required or desired. In a highly novel methodology, the HFRG driving simulator will be adapted so that the vehicle can be driven from either the left or right side of the front cockpit space (i.e. two sets of steering wheels and pedals). This will allow investigation into how two people 'naturally' communicate with each other, both verbally and non-verbally, whilst completing a series of driving activities, including taking or relinquishing control of the vehicle. From this, a range of human-human verbal and gestural interactions will emerge to inform the design of future HMIs for automated vehicles, including natural language and augmented reality interfaces.
People
Emily Shaw (Student)
Studentship Projects
Project Reference | Relationship | Related To | Start | End | Student Name
---|---|---|---|---|---
EP/N50970X/1 | | | 01/10/2016 | 30/09/2021 |
2124785 | Studentship | EP/N50970X/1 | 01/10/2018 | 31/03/2023 | Emily Shaw
EP/R513283/1 | | | 01/10/2018 | 30/09/2023 |
2124785 | Studentship | EP/R513283/1 | 01/10/2018 | 31/03/2023 | Emily Shaw