Toward The Next Generation Of Brain-Computer Interfaces

Lead Research Organisation: University of Sheffield
Department Name: Automatic Control and Systems Engineering

Abstract

A brain-computer interface (BCI) provides a direct communication pathway between a human brain and an external device. Using appropriate sensors and data processing algorithms, a BCI maps patterns of brain activity associated with a volitional thought onto signals suitable for communication and control. In most BCI systems, brain signals are measured by electroencephalography (EEG), due to its non-invasiveness, relatively low cost and high temporal resolution. BCI technology holds great promise as a basis for assisting people with severe communication and motor disabilities. Moreover, it can be applied to non-medical applications such as gaming. Despite impressive expansion in recent years, none of the BCI systems described in the literature are sufficiently mature for daily use outside the laboratory. In order to make BCI the next generation of intuitive interfaces, improvements are required in several areas, such as usability, signal acquisition techniques, hardware development, machine learning and signal processing, and system integration.

Publications


Studentship Projects

Project Reference | Relationship | Related To   | Start      | End        | Student Name
EP/N509735/1      |              |              | 01/10/2016 | 30/09/2021 |
1931447           | Studentship  | EP/N509735/1 | 25/09/2017 | 24/03/2021 | Christopher Wirth
 
Description When most people wish to pick up an object, the conscious thought process is straightforward: we simply choose to pick up the object, reach for it, and grab it. For most Brain-Computer Interface (BCI) users, however, the conscious thought process is much more complex. Users must generally think through each low-level action, such as "move arm left... move arm left... move arm forward... open hand..." and so on. Recent studies have shown the possibility of moving to a more autonomous BCI system by interpreting the brain's reaction to observed robotic actions, and using these human responses to teach the robot which actions to take and which to avoid. It has been shown that recordings of the brain's electrical activity, called electroencephalography (EEG), can be used to detect when users have observed what they perceive to be an error, compared to when they observe correct actions. Recent developments have shown that such error detection can be used as feedback for machine learning, allowing robots to learn optimal paths to target locations.
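The closed-loop idea described above can be illustrated with a minimal sketch: a simulated robot on a 1-D line learns which direction to prefer using only an observer's error judgements as its reward signal. The error detector here is a stand-in for a real single-trial EEG classifier (real classifiers are imperfect, hence the configurable accuracy); all function names, parameters, and the learning rule are illustrative, not the method used in this project.

```python
import random

def simulated_error_detector(moved_closer, accuracy=0.8):
    # Stand-in for a single-trial EEG error classifier: it flags a
    # perceived error whenever the robot moves away from the target.
    # Real classifiers are imperfect, so the label is flipped with
    # probability (1 - accuracy).
    error = not moved_closer
    if random.random() > accuracy:
        error = not error
    return error

def learn_direction(target, start, trials=200, detector_accuracy=0.8, seed=0):
    # A robot on a 1-D line learns which step direction to prefer purely
    # from the (noisy) observer feedback, with no direct access to the
    # target position itself.
    random.seed(seed)
    value = {-1: 0.0, +1: 0.0}          # action-value estimate per direction
    pos = start
    for _ in range(trials):
        # epsilon-greedy choice between stepping left (-1) and right (+1)
        if random.random() < 0.1 or value[-1] == value[+1]:
            action = random.choice([-1, +1])
        else:
            action = max(value, key=value.get)
        new_pos = pos + action
        moved_closer = abs(target - new_pos) < abs(target - pos)
        reward = -1.0 if simulated_error_detector(
            moved_closer, detector_accuracy) else 1.0
        value[action] += 0.1 * (reward - value[action])  # incremental update
        pos = start if new_pos == target else new_pos    # restart on success
    return value

values = learn_direction(target=5, start=0)
```

Even with an imperfect detector, the estimated value of the correct direction rises above that of the incorrect one, which is the essential property that lets observer EEG responses serve as a training signal.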

In the work funded through this award, we intend to investigate whether it is possible to gather more detailed information from EEG recordings as users observe robotic tasks. If so, we intend to use this more detailed information to improve the efficiency and effectiveness of semi-autonomous BCI systems.

Our first key finding was the proof of concept that it is possible to distinguish between the brain's response to observing two different types of navigational error: (a) stepping off a target location when the robot was previously on it, and (b) moving further away from the target location (from an already off-target location). We showed that it is possible to classify the EEG responses to these two error types against each other on a single-trial basis. This finding was published in the Journal of Neural Engineering (https://doi.org/10.1088/1741-2552/ab53fe).

Our second key finding was the proof of concept that it is possible to distinguish between the brain's response to observing two different types of correct navigational action: (a) moving closer to a target, but not reaching it, and (b) reaching the target. We showed that it is possible to classify the EEG responses to these two correct action types against each other on a single-trial basis. This finding was published in Frontiers in Neuroscience (https://doi.org/10.3389/fnins.2020.00066).
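The general shape of single-trial classification of two response types can be sketched as follows, using entirely synthetic "epochs" rather than real EEG: a simple windowed-amplitude feature is extracted from each epoch and classified by the nearer class mean. The 500 Hz rate matches the recordings described later in this record, but the class names, signal shapes, window, and nearest-centroid classifier are illustrative assumptions, not the pipeline used in the published work.

```python
import math
import random

FS = 500           # sampling rate (Hz), matching the recordings described
EPOCH_S = 1.0      # 1 s epoch after each observed action

def synthetic_epoch(kind, rng):
    # One synthetic single-trial epoch: 'target_reached' trials carry a
    # larger positive deflection around 400 ms than 'moved_closer' trials.
    # Purely illustrative -- this is not real EEG.
    amp = 6.0 if kind == "target_reached" else 3.0
    return [amp * math.exp(-((i / FS - 0.4) ** 2) / 0.005) + rng.gauss(0, 2.0)
            for i in range(int(FS * EPOCH_S))]

def window_mean(epoch, start_s, end_s):
    # Mean amplitude in a post-stimulus window -- a classic ERP feature.
    seg = epoch[int(start_s * FS):int(end_s * FS)]
    return sum(seg) / len(seg)

def nearest_centroid(train_feats, train_labels, x):
    # Classify a single trial by the closer class mean of the feature.
    by_class = {}
    for f, y in zip(train_feats, train_labels):
        by_class.setdefault(y, []).append(f)
    centroids = {y: sum(v) / len(v) for y, v in by_class.items()}
    return min(centroids, key=lambda y: abs(centroids[y] - x))

rng = random.Random(42)
classes = ["moved_closer", "target_reached"]
# build a small labelled set of single-trial features (50 trials per class)
data = [(window_mean(synthetic_epoch(c, rng), 0.3, 0.5), c)
        for c in classes for _ in range(50)]
rng.shuffle(data)
train, test = data[:60], data[60:]
X_tr, y_tr = zip(*train)
correct = sum(nearest_centroid(X_tr, y_tr, x) == y for x, y in test)
accuracy = correct / len(test)
```

The point of the sketch is the workflow (epoch, extract a feature, classify each trial independently), not the numbers: real EEG has far lower signal-to-noise than this toy signal, which is why single-trial discrimination of two response types is a non-trivial result.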

We are currently researching both simulated and online application of this fine-grained information in a machine learning scenario, allowing robots to perform both navigation to, and identification of, targets efficiently.

Through the course of the work funded by this award, I have also gained substantial experience in key aspects of performing BCI research, including practical EEG data collection, and state-of-the-art methods for processing such EEG data.
Exploitation Route In the first instance, the outcomes may be taken forward by other researchers. We have provided proofs of concept that show the possibility of differentiating observed robotic actions in finer detail than has previously been applied. Other researchers may now be able to improve further on the classification accuracy that we have reported, thus making resulting systems yet more efficient.

Looking beyond the research environment, the ultimate aim of our current work - afforded by this award - is to move towards a semi-autonomous Brain-Computer Interface (BCI) that can allow users to perform a wide range of tasks simply by focusing on their high-level goal, and observing an assistive robot's actions. In future, this could be put to particular use in the healthcare and biotechnology sectors, by the creation of devices and systems to assist severely disabled users. Such systems could help users to navigate, or to control devices within their homes, without the highly demanding need to consciously control each small, low-level action of the robot or assistive device.
Sectors Digital/Communication/Information Technologies (including Software), Education, Healthcare, Manufacturing, including Industrial Biotechnology, Pharmaceuticals and Medical Biotechnology

 
Title EEG data collected during observation of a virtual robot 
Description Participants observed two similar virtual robot tasks while electroencephalography (EEG) data were recorded to capture their neural responses to the robot's actions. In task 1, a cursor moved in a 1-dimensional space, attempting to move towards a target location and to identify when it had reached this destination. In task 2, the cursor was replaced by a virtual robotic claw, which moved above a series of objects, attempting to move towards and grab the target object. During task 1, 8 channels of EEG were recorded at 500 Hz, using an Enobio 8 headset. During task 2, 20 channels of EEG were recorded at 500 Hz, using an Enobio 20 headset. A total of 27 healthy adults observed the tasks: 10 observed task 1, and 17 observed task 2. All participants had normal or corrected-to-normal vision, and reported no history of psychiatric illness, head injury, or photosensitive epilepsy. Written informed consent was provided before testing began. All procedures for both tasks were in accordance with the Declaration of Helsinki, and were approved by the University of Sheffield Ethics Committee in the Automatic Control and Systems Engineering Department. We are considering making these data publicly available, either through a resource such as BNCI Horizon 2020 (http://bnci-horizon-2020.eu/database/data-sets) or through a relevant special call (e.g. https://www.frontiersin.org/research-topics/9784/datasets-for-brain-computer-interface-applications). 
Type Of Material Database/Collection of data 
Year Produced 2019 
Provided To Others? No  
Impact The data have resulted in two publications: Journal article 1: https://doi.org/10.1088/1741-2552/ab53fe Journal article 2: https://doi.org/10.3389/fnins.2020.00066 
 
Description Trinity College Dublin collaboration 
Organisation Trinity College Dublin
Department Institute of Neuroscience
Country Ireland 
Sector Hospitals 
PI Contribution I processed and performed single-trial analysis on data which were collected and supplied by the collaborators. This involved the intellectual input of both myself and my PhD supervisor, regarding the research questions that were proposed. Further, it involved my machine learning expertise in the analysis of the data.
Collaborator Contribution Collaborators based at Trinity College Dublin collected and supplied data to help in the investigation of one of my research questions. Furthermore, they applied their significant expertise in neuroscience, making a major contribution to the neurophysiological analysis of the data.
Impact Two publications occurred as a direct result of this collaboration: Conference paper: https://doi.org/10.1109/EMBC.2018.8512700 Journal article: https://doi.org/10.1088/1741-2552/ab53fe
Start Year 2017
 
Description Hamlyn Symposium Presentation 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Postgraduate students
Results and Impact I gave both an oral presentation and a poster presentation, regarding work related to this award, at the Hamlyn Symposium on Medical Robotics. The presentations were part of a specific workshop entitled "From BCI to Human Robot Augmentation". As a result of this, I was able to meet a highly respected researcher in the field, whose work has influenced my own. We were able to discuss our existing work, and potential future directions.
Year(s) Of Engagement Activity 2019
 
Description IEEE Young Professionals Symposium 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Postgraduate students
Results and Impact I gave a poster presentation as part of the 2nd IEEE Young Professionals Symposium. The event was a useful way to see current developments from young engineering researchers, and an opportunity to showcase my own research in BCI to an interested and engaged audience. A number of fellow participants asked questions regarding my research as an alternative approach to more commonly used BCI control strategies.
Year(s) Of Engagement Activity 2019
 
Description Interview and demonstration for national news 
Form Of Engagement Activity A broadcast e.g. TV/radio/film/podcast (other than news/press)
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Public/other audiences
Results and Impact A Sky News reporter, Tom Parmenter, visited our lab. I demonstrated a Brain-Computer Interface (BCI) experiment that I have been carrying out as a part of this award. In addition to this, I was interviewed regarding the range of potential applications of BCI technology. The piece aired numerous times throughout the day on 10th September 2019, in line with the launch of The Royal Society's report on neural interfaces. Following this, I received an increase in general interest from members of the public regarding the future scope of BCI. In addition to this, the department received enquiries from school students regarding how to pursue BCI research as a career path.
Year(s) Of Engagement Activity 2019