Deep learning-enabled video-based motion tracking for human-robot interaction in assembly tasks
Lead Research Organisation:
Queen's University Belfast
Department Name: Sch of Electronics, Elec Eng & Comp Sci
Abstract
Manufacturing environments of the future will involve humans interacting with robots at close distance. Optimisation of these interactions will require continuous recording of human movements, to feed into the control algorithms governing the robot movements. Although commercial motion tracking systems are available, they are impractical in most manufacturing environments: they are costly and require sensors to be attached to the body at precisely specified positions. Modern computational tools for extracting patterns from images may provide a viable alternative. In recent years, machine learning has been used to extract body parts and body postures from images of human or animal bodies. When applied to videos - sequences of images - this method could allow 3D body movements to be extracted without attaching sensors to the body. These methods, however, are not yet optimised for the applications described above, and the current PhD project will address several outstanding challenges in this field. It will, for instance, focus on improving training data by synchronising video recordings with state-of-the-art full-body motion tracking, and on adapting the algorithms to account for motion patterns across frames.
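One idea in the abstract - accounting for motion patterns across frames rather than treating each image independently - can be illustrated with a minimal sketch. Everything below is hypothetical and not part of the project: the keypoints are synthetic values standing in for per-frame detections from a pose-estimation model, and the smoothing shown is a generic moving-average filter, one simple way to exploit the continuity of body movement between consecutive frames.

```python
# Illustrative sketch only: temporal smoothing of one joint's per-frame
# keypoint trajectory. In a real pipeline the (x, y) values would come
# from a pose-estimation model run on each video frame; here they are
# synthetic.
from typing import List, Tuple

Keypoint = Tuple[float, float]  # (x, y) image coordinates of one joint

def smooth_trajectory(frames: List[Keypoint], window: int = 3) -> List[Keypoint]:
    """Moving-average filter over a single joint's trajectory.

    Per-frame detections are noisy; averaging over a small temporal
    window exploits the fact that body movement is continuous across
    frames, so neighbouring frames carry information about each other.
    """
    half = window // 2
    smoothed = []
    for i in range(len(frames)):
        lo, hi = max(0, i - half), min(len(frames), i + half + 1)
        xs = [frames[j][0] for j in range(lo, hi)]
        ys = [frames[j][1] for j in range(lo, hi)]
        smoothed.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return smoothed

# Synthetic example: a wrist joint moving steadily right, with
# detection jitter in the vertical coordinate.
raw = [(0.0, 0.0), (1.0, 0.4), (2.0, -0.4), (3.0, 0.4), (4.0, 0.0)]
out = smooth_trajectory(raw)
```

Smoothing is only the simplest way to use temporal structure; learned sequence models over the keypoint trajectories would be a natural next step.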
Organisations
People | ORCID iD
---|---
Seán McLoone (Primary Supervisor) |
SAMUEL ADEBAYO (Student) |
Studentship Projects
Project Reference | Relationship | Related To | Start | End | Student Name
---|---|---|---|---|---
EP/T518074/1 | | | 30/09/2020 | 29/09/2025 |
2446548 | Studentship | EP/T518074/1 | 30/09/2020 | 31/03/2024 | SAMUEL ADEBAYO