A Tongue Movement Command and Control System Based on Aural Flow Monitoring

Lead Research Organisation: University of Bristol
Department Name: Mechanical Engineering

Abstract

Although there is a well-recognised need in society for effective tools to enable the physically impaired to be more independent and productive, existing technology still fails to meet many of their needs. In particular, nearly all mechanisms designed for human control of peripheral devices require users to generate the input signal through bodily movements, most often with their hands, arms, legs, or feet. Such devices clearly exclude individuals with limited appendage control, for example those affected by spinal cord injury, repetitive strain injury, severe arthritis, loss of motion due to stroke, or central nervous system (CNS) disorders.

Past work has attempted to address this need by recognising the potential of the oral cavity for control input, since victims of stroke, spinal damage, and arthritis can often still move their tongue and mouth. Developed mechanisms include track balls, small joysticks, and retainers inserted in the mouth to be manipulated by the tongue, as well as sip-and-puff tubes which respond to concentrated exhaling and inhaling. Despite numerous successful implementations, these devices can be difficult to operate, are problematic if they fall from or irritate the mouth, may impair verbal communication, and present hygiene issues since they must be inserted into the mouth.

The objective of this programme is to surmount these issues through the development of a unique tongue-movement communication and control strategy in a stand-alone device that can be calibrated for patient use with all manner of common household devices and tailored to control assistive mechanisms. The strategy is based on detecting specific tongue motions by monitoring air pressure in the human outer ear, and subsequently providing control instructions corresponding to each tongue movement for peripheral control. Our research has shown that various movements within the oral cavity create unique, traceable pressure changes in the human ear, which can be measured with a simple sensor (e.g. a microphone) and analysed to produce commands that can in turn be used to control mechanical devices or other peripherals. Results from this programme will enable patients with quadriplegia, arthritis, limited movement due to stroke, or other conditions causing limited or painful hand/arm movement to interface with their environment and control all manner of equipment for increased independence and quality of life.

Publications

 
Description The goal of the project was to demonstrate a new concept for human-machine interface (HMI) in assistive device control. While extensive research has been performed on HMIs for command and control, to date nearly all interfaces are limited in their utility outside controlled environments by the need for operator motion, lack of portability, cumbersome integration with other approaches, and equipment expense. In this programme we present the first system we are aware of that addresses all of these issues. The system tracks tongue movement, and thus operator intent, by monitoring airflow in the ear canal, so no external operator movement is required. The system is unobtrusive, trivial to don, and leaves the patient free to carry out any other activity while wearing and using it. The only sensor necessary is a simple microphone in an earpiece housing small and comfortable enough to be worn in the ear indefinitely. To our knowledge, ours is the only research group to have investigated the aural cavity as a monitoring venue for machine interface, and ours is the only proposed system whereby tongue movement may be tracked without inserting any device into the oral cavity. We have observed tongue movement to be faster, quieter, and (in most cases) more intuitive for direct device motion control than speech.
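To make the sensing principle concrete, the sketch below shows one simple way tongue-movement events might be segmented from an in-ear microphone signal by short-time envelope thresholding. It is a minimal illustration under stated assumptions, not the programme's published pipeline: the sample rate, window length, threshold factor, and synthetic test signal are all assumptions made for the example.

```python
# Minimal sketch (illustrative only): segment candidate tongue-movement
# events from an in-ear microphone signal via RMS-envelope thresholding.
import numpy as np

FS = 8000  # assumed sample rate (Hz)

def moving_rms(x, win):
    """Short-time RMS envelope over a sliding window."""
    power = np.convolve(x**2, np.ones(win) / win, mode="same")
    return np.sqrt(power)

def detect_events(signal, fs=FS, win_ms=50, k=4.0):
    """Return (start, end) sample indices where the envelope exceeds
    k times the median envelope (a crude noise-floor estimate)."""
    env = moving_rms(signal, int(fs * win_ms / 1000))
    active = env > k * np.median(env)
    edges = np.flatnonzero(np.diff(active.astype(int)))
    # Pair rising/falling edges into (start, end) events.
    if active[0]:
        edges = np.r_[0, edges]
    if active[-1]:
        edges = np.r_[edges, len(active) - 1]
    return list(zip(edges[::2], edges[1::2]))

if __name__ == "__main__":
    # Synthetic example: quiet noise plus one louder burst standing in
    # for the ear-pressure transient caused by a tongue movement.
    rng = np.random.default_rng(0)
    x = 0.01 * rng.standard_normal(2 * FS)
    x[FS:FS + 800] += 0.2 * np.sin(2 * np.pi * 30 * np.arange(800) / FS)
    print(detect_events(x))  # one event near sample 8000
```

In a real deployment the detected segments would then be classified and mapped to device commands, as outlined in the contribution list below.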

We have further investigated the synergy of the system with existing neural interfaces, with specific focus on brain implants and surface brain electrodes, which could be combined with the tongue-motion system to form a comprehensive interface, and we have expanded our research results to address pattern classification of movement across a range of biosignals. In summary, new contributions in this programme include:



1. A new sensor for monitoring airflow in the aural cavity

2. Implementation of signal capture and recognition algorithms to accurately identify and classify tongue movement through monitoring of airflow in the aural cavity

3. A strategy for detecting and classifying simple and compound tongue movements based on airflow in the aural cavity for robust hands-free HMI

4. New multi-channel pattern classification algorithms to accurately identify physiological signals and correlate them with user intent; the implementation of these algorithms on both tongue movement and neural signals

5. A novel signal capture and segmentation system to identify physiological signals and separate them from other interfering signals in the body; the application of this system to real-time extraction, segmentation, and disturbance rejection of tongue-movement ear-pressure signals, and to off-line classification of neural signals (a minimal classification sketch follows this list)

6. The first published real-time results for a tongue-driven human-machine interface requiring no insertion of any device into the oral cavity. The system has been demonstrated in real-time control of a robotic (prosthetic) assist arm, and in simulation for a power-assist wheelchair.

7. The development of a software system, including a graphical user interface (GUI), with which a user can calibrate the system to recognise their unique tongue signals
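The sketch below illustrates, under stated assumptions, how items 2, 4, 5, and 7 fit together: simple per-event features are extracted from a segmented ear-pressure event and matched against per-user templates recorded during calibration. The feature set, the nearest-centroid classifier, and the command names are hypothetical stand-ins for illustration, not the programme's actual multi-channel algorithms.

```python
# Hypothetical sketch: map a segmented ear-pressure event to a command
# using crude features and a per-user nearest-centroid template match.
import numpy as np

FS = 8000  # assumed sample rate (Hz)

def features(event, fs=FS):
    """Per-event features: duration, log energy, dominant frequency."""
    spectrum = np.abs(np.fft.rfft(event))
    freqs = np.fft.rfftfreq(len(event), d=1.0 / fs)
    return np.array([
        len(event) / fs,                   # duration (s)
        np.log(np.sum(event**2) + 1e-12),  # log energy
        freqs[np.argmax(spectrum)],        # dominant frequency (Hz)
    ])

class NearestCentroidClassifier:
    """Match a new event against command templates learned in calibration."""

    def fit(self, examples):
        # examples: {command name: list of calibration event arrays},
        # as might be gathered through a GUI-driven calibration session.
        self.centroids = {cmd: np.mean([features(e) for e in evs], axis=0)
                          for cmd, evs in examples.items()}
        return self

    def predict(self, event):
        f = features(event)
        return min(self.centroids,
                   key=lambda cmd: np.linalg.norm(f - self.centroids[cmd]))

if __name__ == "__main__":
    # Hypothetical calibration data: "left" events are low-frequency
    # bursts, "right" events higher-frequency bursts.
    rng = np.random.default_rng(1)
    t = np.arange(1000) / FS
    make = lambda f: [np.sin(2 * np.pi * f * t)
                      + 0.05 * rng.standard_normal(1000) for _ in range(5)]
    clf = NearestCentroidClassifier().fit({"left": make(30),
                                           "right": make(120)})
    print(clf.predict(make(30)[0]))  # expected: "left"
```

In practice the predicted command would be forwarded to the controlled peripheral (e.g. a wheelchair or prosthetic arm controller); the nearest-centroid rule stands in here for the multi-channel pattern classification algorithms developed in the programme.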

Research output and impact from the programme have been prolific. The research has produced 8 refereed journal publications, with another 2 currently in preparation, and more than 20 refereed conference publications. Public impact includes major media coverage in outlets including New Scientist, The Engineer, and the Association for Computing Machinery (ACM), and a feature in a 2010 episode of BBC West's "Inside Out". Videos associated with BBC Inside Out and New Scientist featured the first ever tongue-controlled prosthetic hand.
Exploitation Route Commercial applications include prosthetic limbs and rehabilitation/assist equipment, including interfaces for power wheelchair control. Although the device is not applicable in every situation (e.g. where force feedback is required), we believe it represents a very significant contribution to assistive technology and will lay the foundation for a new generation of hands-free human-machine interface systems.



At this time, the company Think-A-Move has been given the results and is consolidating them for use in a new tongue-controlled wheelchair. The prosthetics company RSL Steeper has also made in-kind contributions to allow the system to be demonstrated on their bebionic prosthetic hand. Possible avenues of commercialisation in this regard are also being considered.
Future work could involve synergising the tongue-movement and neural modes of interface to develop a cohesive, robust human/robot interface. A foundation for two distinct modes of operation is envisioned, whereby several devices (e.g. a power wheelchair, prosthetic limbs, household appliances, stationary mechanical assist devices) could all be directed through the available modes of control input.
Sectors Electronics,Healthcare

URL http://faculty.nps.edu/ravi/HMI/Bristol%20Tongue%20Control%20Video.avi
 
Description Findings from this programme created the first ever tongue-based machine interface system requiring no oral insertion. It led to the awards "A Teleoperative Sensory-Motor Control System" (Imperial College-NUS Singapore), "Integrated motion, muscle, and neural activity monitoring" (EPSRC kick-start internal award), and "Multimodal Hands-Free Interfaces for Robotic Teleoperation" (US Army award to Think-A-Move (TAM), Ltd for commercialisation; reference: W56HZV-05-C-103), worth £800,000 collectively. It spawned new products in human-robot interface currently being adopted by the US military. Industrial adoption also resulted in the first tongue-controlled wheelchair with no oral obstruction. The work was also highlighted in a keynote video for the 'New Technology Foundation Award', recognising the highest innovation, at the 20th-anniversary IEEE IROS conference. The Engineer (2008) and New Scientist (2010) featured the research; a New Scientist video showing the first tongue-controlled prosthetic hand received more than 6,000 views in its first week.
First Year Of Impact 2009
Sector Aerospace, Defence and Marine,Healthcare
Impact Types Societal,Economic

 
Description Dyson Foundation Research
Amount £160,000 (GBP)
Organisation Dyson Foundation 
Sector Charity/Non Profit
Country United States
Start 10/2011 
End 10/2016
 
Description UK Engineering and Physical Sciences Research Council (EPSRC), Kick Start Grant
Amount £20,000 (GBP)
Organisation Engineering and Physical Sciences Research Council (EPSRC) 
Sector Public
Country United Kingdom
Start 04/2012 
End 04/2013
 
Description US Office of Naval Research NICOP Program
Amount $375,000 (USD)
Organisation US Navy 
Department US Office of Naval Research Global
Sector Academic/University
Country United States
Start 07/2014 
End 12/2017
 
Description Think-A-Move 
Organisation Think-A-Move, LLC
Country United States 
Sector Private 
PI Contribution Development of algorithms and mathematical architectures supporting commercialization of speech and tongue based aural interface.
Collaborator Contribution Hardware design and earpiece fabrication for speech and tongue based aural interface
Impact Production of the first ever tongue-controlled wheelchair that did not require any oral-cavity device. Production of a human-robot interface, requiring no gesture control or oral interface, that has been written into the next-generation US Army specifications for control of unmanned vehicles.
 
Title Oral Controlled Wheelchair 
Description The company Think-A-Move, LLC (USA) created a new wheelchair control system based on voice and tongue movement, drawing on our academic research. 
Type Therapeutic Intervention - Medical Devices
Current Stage Of Development Late clinical evaluation
Year Development Stage Completed 2011
Development Status On hold
Impact This system was the first ever wheelchair control system that did not require oral cavity insertion or bodily movement. 
 
Description BBC Television Programme 
Form Of Engagement Activity A magazine, newsletter or online publication
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Media (as a channel to the public)
Results and Impact The research was featured in the "Assisted Living" episode of "BBC Inside Out". A demonstration of the first ever tongue-controlled prosthetic hand was shown on the programme, which reached an audience in the thousands. Broadcast by BBC West on 8 November 2010.

Members of the general public expressed interest in visiting my lab and reported an increased awareness of assistive technology.
Year(s) Of Engagement Activity 2010
 
Description Public Press Resulting from Project 
Form Of Engagement Activity A magazine, newsletter or online publication
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Public/other audiences
Results and Impact The New Scientist article "Tongue clicks to control wheelchairs", highlighting the research's assistive potential for prosthetic limb and wheelchair control, was published on 4 December 2010 (p. 24).

A YouTube video posted by New Scientist about the article had over 6,000 views within a week of posting on the New Scientist website.
Year(s) Of Engagement Activity 2010
URL http://www.newscientist.com/article/dn19790
 
 
Description Royal Society Summer Science Showcase 2009 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? Yes
Geographic Reach National
Primary Audience Public/other audiences
Results and Impact A booth demonstrating parts of the system was constructed for the 'Biologically Inspired Systems' display at the 2009 Royal Society Summer Science Exhibition. Over 1,000 people attending the event visited the booth.

Members of the lay public, and members of the Royal Family, observed the display, resulting in very wide dissemination.
Year(s) Of Engagement Activity 2009