A Neuromorphic Control System for Agile Biped Walking

Lead Research Organisation: Middlesex University
Department Name: Faculty of Science & Technology

Abstract

Rush hour in a London mainline railway station: a passenger swiftly and effortlessly threads through the swirling crowd, glancing at a large screen 20 metres away, talking on a mobile phone held in his left hand, carrying a cup of coffee in his right, avoiding collision with anyone, and making his way to platform 14.

This seems effortless. From a robotics perspective, however, it is almost miraculous, because all these tasks are controlled efficiently in parallel by a single brain. To behave like this human, today's robot would need a large million-dollar supercomputer, or several networked computers consuming kilowatts of energy, and its performance would still not be comparable to that of a human brain. Building a robot brain by reverse-engineering the human or animal brain has been the ultimate goal of many large interdisciplinary projects in recent years. Today, the most promising technology for physically and structurally emulating the brain is neuromorphic engineering, which uses electronic circuits to mimic neuro-biological architectures. Compared with standard computer-based controllers, neuromorphic controllers are naturally parallel, more compact, and more energy efficient. It is widely thought that a neuromorphic brain will be at the centre of the next generation of intelligent autonomous robots.

Many studies in neuromorphic engineering have developed neuromorphic systems that realize specific functional modules of the brain, e.g., hearing, vision, olfaction, cognition, and action learning. The proposed project targets another fundamental control function of the human brain: bipedal (two-legged) walking. Just like humans and animals, a robot must be able to move agilely in order to execute its tasks in the natural environment. Compared with their traditional counterparts, however, neuromorphically controlled legged robots (especially biped robots) perform poorly in terms of versatile and agile locomotion. This is mainly because their neuromorphic circuits emulate only the basic functional module of the spinal neural network, which can realize only propulsion control. In animals, propulsion control and body posture control are fully integrated, which is fundamental to agile locomotion in a complex natural environment. In humans in particular, to meet the functional requirements of agile bipedal walking, the spinal neural network is heavily modulated by the supraspinal levels. However, it is still not fully understood in biology how the neuronal modules at the spinal and supraspinal levels interact with and modulate each other in the control of human bipedal locomotion.

Building on the team's track record in biped robotics, neuromorphic circuit design, neuromorphic simulation, and computational neuroscience, the proposed project aims to fill this gap by developing a multi-module, multi-level (i.e., spinal and supraspinal) neuromorphic system. In this system, we will implement the functions of three neuronal modules known to play important roles in human locomotion control. By coupling the neuromorphic system with a purposely designed biped robot using a new method (model-driven concurrent integration), we will be able to explore the unknown interaction/modulation mechanisms between these modules that could lead to agile biped walking.

At the heart of our proposal is the ambition to make a notable step forward in the area of neuromorphic robotics. This project will, for the first time, demonstrate an agile 3D biped robot that has human-like walking patterns and a neuromorphic control mechanism.

Planned Impact

Our project could have academic and societal impacts in the following three key areas.

1. Shedding new light on the neuronal control mechanism of human locomotion

Rapidly advancing robotics technologies are providing new tools for testing biological hypotheses and uncovering complex mechanisms in biological systems that would be difficult or impossible to study using conventional methods. For example, biologists have long marvelled at the complex social behaviours of insects such as bees and ants, in which different groups of individuals specialize in different tasks and cooperate seamlessly. Now, using swarm robotics and evolutionary robotics, a team of roboticists has offered a new explanation of how such specialization and cooperation mechanisms may have evolved from simple behaviours. Likewise, there are puzzling questions regarding the control mechanism of human bipedal locomotion. It is thought that human walking may be controlled by three neural modules: Central Pattern Generators (CPGs), reflexes, and an internal model. But we do not know how these modules connect and interact with each other to generate versatile and agile biped walking. This is a critical issue in neuroscience and evolutionary biology, because bipedalism is the fundamental human characteristic that distinguished the first hominids from the other, four-legged apes. The lack of knowledge of the brain's detailed circuits should not prevent us from probing this issue using robotics. One central task of this project is to explore it using a multi-module, multi-level neuromorphic system coupled with a sophisticated biped robot. If, at the end of the project, the neuromorphic system involving those three neural modules successfully generates agile biped walking in our robot, the interaction/modulation mechanism developed here will suggest new testable hypotheses on how these modules interact in human locomotion control. These insights will be particularly useful for developing smart next-generation neuro-prostheses.

2. Developing sophisticated neuromorphic robots using a systematic integration approach

The neural circuits and body mechanics of animal locomotor systems have co-evolved for millions of years. This has produced fully integrated hierarchical neuro-mechanical systems, in which the neural circuits are closely coupled with the muscular system at various levels and time scales. Similarly, a future neuromorphic robot may need an integrated multi-module, multi-level architecture (like the system in this project) if it is to be capable of sophisticated dynamic motion, such as the agile biped walking targeted here. System integration will therefore be a central issue in the design of such a robot, and it cannot be solved by the ad hoc or heuristic integration approaches widely used in today's neuromorphic and biomimetic robots. The "model-driven concurrent integration" approach proposed in this project aims to solve this critical issue systematically in the emerging area of neuromorphic robotics, and should inspire researchers to develop new systematic integration approaches for efficiently building sophisticated neuromorphic robots.

3. Inspiring future roboticists and furthering public understanding of advanced robotics

As our project explores exciting possibilities at the interface of two publicly appealing research areas (biped robots and the neuromorphic brain), it has great potential for engaging young scientists, from school age to early-career graduates.
Our previous work on a record-breaking fast biped robot attracted media interest (e.g., BBC News). We expect the proposed project to engage similar public interest, given the public's appetite for humanoid/biped robots. Such media coverage would provide an opportunity to show a wide public audience how biologically inspired research in robotics is undertaken.
 
Description Our study explored a biologically plausible solution to the poorly understood development of infant bipedal motion, from the standing-up capability to general biped behaviours such as stance balance, walking, and running. This developmental process can be cast as collective learning of body dynamics, body-environment interaction, motion control, and motion planning, whereas current scientific and engineering studies consider these aspects disjointedly. Classic and industrial robotics approaches model the system dynamics first and engineer optimised motion control and motion planning algorithms on that basis; biologically inspired robotics approaches have explored a number of model-less methods, including model-free motion driven by selected patterns and reduced-order control and planning using template models; and the artificial intelligence field has demonstrated that learning-based approaches can find desired control policies, including mimicking labelled human motions and raising the probability of desired actions according to specific rewards.

Towards a ground-up, learning-based solution to infant-like biped motion development, and with the three concerns above in mind, we proposed and developed a new motion intelligence framework: sense representation. Mathematically, the proposed sense representation is a topologically specified collection of identity mappings from measured, estimated, or generalised states to perceived states. Its template form is a single central consciousness core surrounded by orbital senses; both are types of sense, the framework's building blocks. The mechanism can be interpreted as follows: the central consciousness core orchestrates the orbital senses, and the tensor flows between associated senses are synchronised. Unlike the central consciousness core, which has only a single kernel, a general sense consists of a single kernel plus multiple shells replicated from it. The kernel comprises three densely associated identity mappings: orbital consciousness, value, and context. The context region is domain-specific and contains a sequence of selected state-action pairs; the value region indicates the controllability and observability of the dynamics of the attached context; and the orbital consciousness region, transferred from the value region, becomes domain-independent and is designed to further monitor and report the meta status of the attached context. The shells are topologically flexible to manipulate: at runtime they can be placed in parallel, as a candidate pool for decision making, or in sequence, for reasoning. For each sense, its orbital consciousness region is in a private loop with the central consciousness core; the collection of these individual circulations, together with the specified associations among orbital senses, structures the topology of the sense representation.
The proposed setting from the sense representation framework covers the three principal dimensions of consciousness: unconscious computation, global information broadcasting, and self-monitoring.
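The structure described above (a core kernel, senses built from a kernel plus replicated shells, and an orbital arrangement around the core) can be sketched in plain Python. This is only an illustrative sketch: all class and field names below are our hypothetical choices, not the project's actual API.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class Kernel:
    """A kernel: three densely associated regions, per the description above."""
    context: List[Tuple[str, str]] = field(default_factory=list)  # domain-specific state-action pairs
    value: float = 0.0                                            # controllability/observability indicator
    consciousness: Dict[str, str] = field(default_factory=dict)   # domain-independent meta status

@dataclass
class Sense:
    """A general sense: one kernel plus shells replicated from it."""
    kernel: Kernel
    shells: List[Kernel] = field(default_factory=list)

    def replicate_shell(self) -> Kernel:
        # Shells are copies of the kernel; at runtime they could be arranged
        # in parallel (a candidate pool) or in sequence (for reasoning).
        shell = Kernel(context=list(self.kernel.context),
                       value=self.kernel.value,
                       consciousness=dict(self.kernel.consciousness))
        self.shells.append(shell)
        return shell

@dataclass
class SenseRepresentation:
    """Template form: one central consciousness core surrounded by orbital senses."""
    core: Kernel
    orbital: List[Sense] = field(default_factory=list)
```

A usage sketch: build a representation with one orbital sense, then replicate two shells from its kernel to form a small candidate pool.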

Based on the proposed framework, our biped agent is designed with five levels of domain-specific senses, ranging from detailed physical dynamics to a generalised motion template, and its motion learning is cast as representation learning in which the biped dynamics model, motion control, and motion planning are learned jointly and perceptually. The biped agent will be able to stand up, balance, walk, and run once its topmost sense achieves sufficient controllability and observability.

We developed a toolkit to implement the sense representation framework. The toolkit is forked from PyTorch; on top of its C++ backend for tensors and neural network functions, we developed new timed graph modules for the proposed sense and sense representation, defined as a subgraph and a graph respectively. In particular, a new engine for dispatching node tasks is designed to ensure that each computational flow of forward tensors and backward gradients is scheduled according to its perceived time. Currently, the low-level senses can be learned using the toolkit. We encountered an issue when the biped agent started to learn a higher-level sense: the main problem so far lies in the timed association between different senses, and we are working to fix it.
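The core idea of the dispatch engine (running each node task in order of its perceived time, whether it carries forward tensors or backward gradients) can be illustrated with a plain priority queue. This is a minimal Python sketch under our own assumptions, not the toolkit's actual C++ engine; the class name `TimedDispatcher` is hypothetical.

```python
import heapq
from typing import Callable, List, Tuple

class TimedDispatcher:
    """Dispatch node tasks in order of their perceived time."""

    def __init__(self) -> None:
        # Heap entries: (perceived_time, submission_order, task).
        # The submission counter breaks ties between tasks with equal times.
        self._queue: List[Tuple[float, int, Callable[[], None]]] = []
        self._seq = 0

    def submit(self, perceived_time: float, task: Callable[[], None]) -> None:
        heapq.heappush(self._queue, (perceived_time, self._seq, task))
        self._seq += 1

    def run(self) -> None:
        # Both forward-tensor and backward-gradient flows would be submitted
        # as tasks; each is executed when it reaches the front of the queue.
        while self._queue:
            _, _, task = heapq.heappop(self._queue)
            task()
```

For example, a flow submitted with perceived time 1.0 runs before one submitted earlier with perceived time 2.0, regardless of submission order.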
Exploitation Route This finding may help researchers develop better controllers for biped robots.
Sectors Digital/Communication/Information Technologies (including Software),Electronics