Integrated robotic system for characterisation and decommissioning

Abstract

This project will deliver an advanced mobile manipulator robot for nuclear decommissioning, based on a customised combination of proven industrial robotics hardware, state-of-the-art control algorithms, and advanced interfaces. The robot will comprise a ruggedised mobile platform carrying advanced manipulator arms and a variety of tooling (hands/grippers, cutting tools, pressure hose). State-of-the-art AI and machine vision algorithms will be informed by a diverse sensor suite (vision, radiological, thermal) to provide 3D characterisation and semi-autonomous control. Characterisation is a necessary precursor to decommissioning. Our team has demonstrated how state-of-the-art machine learning and computer vision (UoB; all partners are defined in later sections) can provide real-time 3D reconstruction of scenes while simultaneously recognising and labelling materials (concrete, metal, wood, ceramics) and waste-like objects (rubber gloves, cans, pipe-work, hoses). We have also demonstrated (lead partner BRL) how this information can then be used for efficient navigation amongst the characterised materials and objects in those scenes. This will be augmented by combining vision sensors with radiation, contaminant, thermal and other sensors to automatically annotate 3D models of scenes with rich characterisation data, informing both decommissioning planning and real-time monitoring during remote operations. The characterisation data will also be combined with advanced finite element analysis models (partner NNL), enabling planning and risk analysis (e.g. cutting a component could have wider structural implications). Following characterisation, decommissioning interventions will require grasping, cutting, and manipulating parts of legacy plant, as well as decontamination (scabbling, grinding or pressure spraying). Our team is at the forefront of research in advanced control of remote robots for performing such actions.
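To illustrate the kind of annotated 3D model described above, the sketch below attaches a radiological reading to nearby points in a material-labelled scene reconstruction. The abstract does not specify a data structure or fusion method; the `ScenePoint` record, the `annotate` helper, and the fixed-radius association rule are all hypothetical, purely for illustration.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class ScenePoint:
    """One point in a reconstructed 3D scene (illustrative structure)."""
    xyz: Tuple[float, float, float]       # position from vision reconstruction
    material: str = "unknown"             # e.g. concrete, metal, wood, ceramic
    dose_rate: Optional[float] = None     # radiological reading (uSv/h)
    temperature: Optional[float] = None   # thermal reading (deg C)

def annotate(points, sensor_reading, radius=0.5):
    """Attach a dose-rate reading to all points within `radius` metres
    of the sensor's measurement location (a deliberately simple rule)."""
    sx, sy, sz = sensor_reading["xyz"]
    for p in points:
        px, py, pz = p.xyz
        if (px - sx) ** 2 + (py - sy) ** 2 + (pz - sz) ** 2 <= radius ** 2:
            p.dose_rate = sensor_reading["dose_rate"]
    return points
```

In practice the association of sensor readings with geometry would depend on sensor pose estimation and field-of-view models; this sketch only shows the shape of the resulting annotated data.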
While conventional direct teleoperation must always remain available to the human operator, our research has shown that incorporating elements of advanced autonomous robot control via an advanced Human-Machine Interface (HMI), so as to provide variable autonomy as an operator-assistance technology, can improve safety, speed and efficiency, whilst greatly reducing operator stress and workload. Our team has already demonstrated AI- and vision-guided robot arms at UK nuclear industry sites: 1) human-supervised autonomous robot grasping of (inactive) waste-simulant objects (at Workington) – NNL & UoB; 2) human-supervised autonomous laser-cutting of radioactive metal (Preston active cave) – NNL & partner ARM. This project will extend these methods from large, fixed manipulators to versatile mobile manipulators (robotic arms mounted on robot vehicles). We will integrate advanced manipulation methods with our state-of-the-art vehicle navigation system, which allows a human operator to dynamically select between different levels of autonomy (LOA), ranging from direct joystick control to fully autonomous navigation. Phase 1 will involve demonstrations at BRL using existing robots drawn from our team's equipment resources. In Phase 2, NNL will supervise the design and build of an inactive, plant-representative testing arena within one of the new 200 sq. m spaces being specially dedicated for this purpose at BRL, which will be used to demonstrate, evaluate and benchmark the Phase 2 robot.
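The operator-selectable levels of autonomy described above can be sketched minimally as a switch between command sources, with a blended "shared" mode in between. The abstract does not describe the control architecture; the `LOA` levels, `blend_command` function, and the `alpha` blending weight are assumptions for illustration only.

```python
from enum import Enum

class LOA(Enum):
    TELEOP = 0       # direct joystick control
    SHARED = 1       # operator command blended with autonomy
    AUTONOMOUS = 2   # planner drives; operator supervises

def blend_command(loa, joystick_cmd, planner_cmd, alpha=0.5):
    """Return the (linear, angular) velocity command for the selected LOA.

    alpha is a hypothetical autonomy weight used only in SHARED mode.
    """
    if loa is LOA.TELEOP:
        return joystick_cmd
    if loa is LOA.AUTONOMOUS:
        return planner_cmd
    # SHARED: weighted blend of human and planner inputs
    return tuple((1 - alpha) * j + alpha * p
                 for j, p in zip(joystick_cmd, planner_cmd))
```

A real variable-autonomy system would also handle safe hand-over between modes and operator override at any time; this sketch only shows how a single command could be resolved once a level is selected.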
