Integration of multisensory information for skilled grasp

Lead Research Organisation: University College London
Department Name: Institute of Neurology

Abstract

I have always been fascinated by the human hand, the principal organ through which we interact with our environment. If we lose the ability to use our hands due to illness or trauma, our physical interactions with the environment are critically jeopardised. In our daily life, we use our hands to grasp and manipulate hundreds of objects and tools with an apparently effortless grace. For example, if I see a pen on my desk I can reach out, grasp it and use it. The sight of the pen tells me not only how far to reach but also evokes memories of how heavy the object is likely to be and where its mass is distributed. This means that by the time my hand has reached the pen, it is in the correct orientation and position to pick it up and prepare to write. As soon as I touch the pen, sensations from the hand tell me whether my estimates of its heaviness or size were correct, and if not, I use the newly acquired information to update my movement. Vision provides accurate information about size and orientation; in contrast, the hand can effectively 'see' around corners and can feel textures, so that we know how slippery a surface is. Therefore, to permit skilled grasp, the brain needs to extract information from multiple sensory sources, principally vision and touch.
Previously, I studied how the brain transfers visual information about an object to the motor system. However, this is only half of the picture, and I believe it is now crucial to understand how the brain integrates other sensory modalities in planning our interaction with the environment. Information from touch only becomes available once we contact an object. Therefore, to plan a grasping action, the brain needs to use stored memories of an object's tactile properties. After contact with the object is made, the brain can use fresh tactile cues to consolidate this memory and update the grasp, if required. This raises the possibility that there might be two different mechanisms by which relevant visual and tactile information are processed: visual information would first be integrated with tactile memory; then, after object contact, vision would be processed together with current tactile cues. This project is about how and where in the brain vision and touch are combined to control grasp. How does the brain switch from dominant visual control during the reach to combined visual-tactile control as soon as we contact the object? These questions are important for a number of reasons. Improving tactile feedback from instruments and tools can improve dexterity, which matters for professionals such as surgeons and technicians. A better knowledge of how vision and touch are integrated will also allow us to develop machines that can replicate human grasp and directly benefit paralysed patients. Finally, some neurological diseases (such as stroke and peripheral neuropathies) affect sensorimotor control of the hand, impairing the ability to use sensory inputs to control grasp; understanding how such control integrates these inputs may help us devise better strategies to improve these patients' quality of life.
I plan to study grasping movements in a virtual reality environment in which I can control vision of an object separately from touch. I will test how each factor influences the other during different phases of a grasping action. These experiments will reveal how changing the balance between vision and touch affects grasping behaviour. To understand how this is reflected in the brain, I will use transcranial magnetic stimulation, a method of painlessly stimulating the brain through the intact scalp, in conjunction with brain scanning methods that reveal patterns of activity in the brain. This will allow me to investigate how areas of the cerebral cortex involved in processing vision and touch interact and transfer information to the motor cortex. These approaches should illuminate in detail the brain mechanisms involved in the sensorimotor integration of vision and touch.

Technical Summary

Skilled grasp is a sensorimotor function requiring the brain to extract sensory information primarily from vision and touch. However, because tactile cues are not available before object contact, grasp planning can be guided only by vision and by a tactile memory of the object. After object contact, tactile cues become available and must be integrated with visual cues for online grasp control. To date, it remains unknown how sensory working memories of objects and online tactile inputs are integrated with vision, and how that information is transferred to motor areas. The question is important because the successful combination of vision and touch is critical for skilled hand actions, such as the use of a surgical instrument. A better understanding of grasp function could also inform the design of enhanced hand prostheses and brain-machine interfaces, delivering an improved quality of life for patients with paralysed upper limbs.
In this project, I will investigate the motor performance of human grasp using a virtual reality environment that allows visual and haptic inputs about an object to be controlled independently. First, I will explore how motor control parameters are affected by visual and haptic cues. Second, I will identify which brain areas play a causal role in integrating vision and touch. Combining transcranial magnetic stimulation (TMS) with functional brain imaging will allow me to reveal which brain areas exert causal influences on the motor system depending on the presence of visual or tactile cues, providing a precise spatial topography of TMS-related changes in brain activation. Finally, I will investigate changes in physiological interactions within the cortical grasping circuit, using a TMS technique that I pioneered to test how the motor system channels vision and touch during movement planning and execution. This approach will also shed light on the time course of changes in the motor system.

Planned Impact

My research has important potential impact on society. Hand function is fundamental to human existence, yet impairments of human hand control are relatively common. For example, the prevalence of stroke and Parkinson's disease in Western countries is around 500 and 300 per 100,000, respectively; the prevalence of other neurological motor disorders, such as essential tremor, may be even greater. The resultant cost to the National Health Service (NHS) and to human quality of life is very large. Dramatic improvements in our understanding of motor control can lead to effective treatments for some patient groups. A survey in the United States found that quadriplegic individuals ranked regaining hand and arm function 40% higher than walking or normal sensation (Anderson, Journal of Neurotrauma, 2004).

Therefore, basic research into how the brain integrates vision and touch for skilled grasp could improve the way we treat disorders affecting hand function. My virtual reality experimental setup could be simplified so that patients with hand movement disorders could take it home to practise movements of increasing difficulty. We recently started a partnership with a company in London (Inition, www.inition.co.uk). One aim is to design a user-friendly virtual reality interactive game that can be used both to evaluate the severity of a patient's movement deficits and for rehabilitation, whether with physiotherapists or at home. At present, assessing a patient's movement performance relies on dated batteries of tests, whereas this modern tool is highly sensitive in revealing deficits in how patients shape their hand or scale their grip force when lifting objects. Moreover, sustained rehabilitation at home with a tool that scores the patient's performance is a way of enhancing motivation, and it allows neurologists and physiotherapists to follow the patient's progress.

In addition, a better knowledge of the sensory rules that the brain uses to integrate vision and touch could improve the design of robotic grasp and hand prostheses. So far, robots have mainly used visual information to guide simple actions; adding tactile sensing capabilities could improve their grasping dexterity. Altogether, this could result in an improved quality of life for patients with paralysed hands.

Our hands are fundamental to our interaction with the environment. In recent years, a growing number of our interactions with the environment require us to grasp tools. To improve quality of life, companies invest considerable effort in making tool use more efficient (compare a screwdriver from 20 years ago with a recent one): tool shapes are now more ergonomic, and surface textures have higher friction to improve grasp stability. Understanding how an object's intrinsic properties affect hand shaping and force scaling will directly benefit companies seeking efficient designs for new tools.

Finally, basic research into vision and haptics could allow haptic feedback to be incorporated into precision surgical instruments. So far, surgical instruments used during remote operations provide no haptic feedback: when surgeons cut a blood vessel, they do not feel the tissue's resistance against the teleoperated scissors. A recent review (Okamura, 2009) showed that this lack of haptic feedback leads to more mistakes during surgical operations.

References:
Anderson KD. Targeting recovery: priorities of the spinal cord-injured population. J Neurotrauma. 2004 Oct;21(10):1371-83.
Okamura AM. Haptic feedback in robot-assisted minimally invasive surgery. Curr Opin Urol. 2009 Jan;19(1):102-7. Review.

Publications

Bunday KL (2016) Grasp-specific motor resonance is influenced by the visibility of the observed actor. Cortex

Davare M (2012) Cortical Connectivity

Katschnig-Winter P (2014) Motor sequence learning and motor adaptation in primary cervical dystonia. Journal of Clinical Neuroscience

Leib R (2016) Stimulation of PPC Affects the Mapping between Motion and Force Signals for Stiffness Perception But Not Motion Control. Journal of Neuroscience

Palmer CE (2016) A Causal Role for Primary Motor Cortex in Perception of Observed Actions. Journal of Cognitive Neuroscience

Palmer CE (2016) Physiological and Perceptual Sensory Attenuation Have Different Underlying Neurophysiological Correlates. Journal of Neuroscience

Pareés I (2013) Failure of explicit movement control in patients with functional motor symptoms. Movement Disorders

 
Description Every day, people use their hands effortlessly to assess an object's stiffness, such as the ripeness of a piece of fruit. For the first time, an international team of scientists led by University College London has discovered the area of the brain where stiffness perception is formed. The findings, published on 12 October 2016 in the Journal of Neuroscience, could aid rehabilitation in patients with sensory impairments.

Our paper, 'Stimulation of PPC affects the mapping between motion and force signals for stiffness perception but not motion control', was published on 12 October 2016 in the Journal of Neuroscience; see DOI: dx.doi.org/10.1523/jneurosci.1178-16.2016
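By way of illustration (this is not the published study's analysis code), stiffness is the slope of the relationship between force and displacement, k = F/x, so perceiving it requires the brain to map motion signals onto force signals. A minimal Python sketch of that mapping, with made-up numbers and hypothetical variable names:

    import numpy as np

    # Hypothetical recordings from a single probing movement (illustrative values).
    displacement = np.array([0.002, 0.004, 0.006, 0.008])  # m, surface penetration
    force = np.array([1.6, 3.2, 4.8, 6.4])                 # N, fingertip force

    # Stiffness is the slope of the force-displacement relationship,
    # recovered here by a least-squares linear fit.
    k_hat, intercept = np.polyfit(displacement, force, 1)
    print(f"estimated stiffness: {k_hat:.0f} N/m")  # prints 800 N/m

In this framing, a manipulation that distorts how motion and force signals are combined would bias the perceived slope even if each raw signal remains intact.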
Exploitation Route Virtual reality technologies are becoming ubiquitous. How we sense virtual objects is a key question for improving this technology.
Sectors Digital/Communication/Information Technologies (including Software),Electronics,Healthcare,Pharmaceuticals and Medical Biotechnology

URL http://www.bbsrc.ac.uk/news/health/2016/161012-pr-scientists-discover-how-we-sense-stiffness/?colour=yellow&issue=Winter%202017
 
Description One of our latest papers, published in the Journal of Neuroscience on 12 October 2016, made the cover of the journal. As a result, our funder (BBSRC) interviewed us in our laboratory, and the resulting video was posted on YouTube, where it is accessible to the general public. Link: https://www.youtube.com/watch?v=1k9cZ8QNuxU
Sector Digital/Communication/Information Technologies (including Software),Electronics,Healthcare,Leisure Activities, including Sports, Recreation and Tourism,Manufacturing, including Industrial Biotechnology,Pharmaceuticals and Medical Biotechnology
Impact Types Cultural,Societal

 
Description Royal Society International Exchange scheme
Amount £12,000 (GBP)
Funding ID IE130980 
Organisation The Royal Society 
Sector Charity/Non Profit
Country United Kingdom
Start 01/2014 
End 12/2016
 
Title Grip-lift synergy manipulandum 
Description A manipulandum fitted with two 3-D force-torque sensors, capable of measuring the fingertip forces applied when grasping and lifting objects (a minimal force-decomposition sketch follows this entry).
Type Of Material Physiological assessment or outcome measure 
Year Produced 2013 
Provided To Others? Yes  
Impact Use of the grip-lift synergy as diagnostic tool in patients with peripheral neuropathies. 
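As an illustration of how such recordings are typically analysed, grip-lift studies decompose each fingertip force vector into a grip component (normal to the grip surface) and a load component (vertical, opposing gravity). A minimal sketch, assuming the sensors' x-axes point into the object and their z-axes point upward (axis conventions and names are my assumptions, not the device's actual output format):

    import numpy as np

    def grip_and_load(f_thumb, f_index):
        """Decompose two fingertip force traces into grip and load force.

        f_thumb, f_index: arrays of shape (n_samples, 3) holding (x, y, z)
        forces in newtons, with x normal to the grip surface and z vertical.
        """
        f_thumb = np.asarray(f_thumb, dtype=float)
        f_index = np.asarray(f_index, dtype=float)
        # Grip force: mean magnitude of the two opposing normal forces.
        grip = 0.5 * (np.abs(f_thumb[:, 0]) + np.abs(f_index[:, 0]))
        # Load force: net vertical force carried by the two digits.
        load = f_thumb[:, 2] + f_index[:, 2]
        return grip, load

The grip-to-load ratio over the course of a lift is a standard index of anticipatory force scaling, which is what makes such a manipulandum useful as a diagnostic tool.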
 
Title Virtual reality environment for hand movements 
Description A virtual reality experimental setup in which an object's 3-D image is generated by a computer screen and projected onto a table via a semi-reflecting mirror. A vivid 3-D object is simulated by two robot-controlled force-feedback haptic devices (Phantom robots, Sensable Technologies) attached to the thumb and index fingertips. This allows subjects to experience visual and haptic feedback while manipulating a virtual object (a minimal rendering sketch follows this entry).
Type Of Material Physiological assessment or outcome measure 
Provided To Others? No  
Impact Research is ongoing 
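As a sketch of how a setup like this can render a touchable virtual object, the snippet below implements the spring-damper contact model commonly used in haptic rendering; the names and parameter values are placeholders, not the Phantom devices' actual API:

    import numpy as np

    STIFFNESS = 800.0  # N/m, stiffness of the simulated surface
    DAMPING = 2.0      # N*s/m, damping to keep the contact stable

    def contact_force(pos, vel, surface_z=0.0):
        """Force (N) to command on one haptic device for one servo cycle.

        The virtual object's top surface is the plane z = surface_z; a
        restoring force is generated only while the fingertip at position
        `pos` (m), moving with velocity `vel` (m/s), penetrates it.
        """
        penetration = surface_z - pos[2]
        if penetration <= 0.0:
            return np.zeros(3)  # no contact, no force
        fz = STIFFNESS * penetration - DAMPING * vel[2]
        return np.array([0.0, 0.0, max(fz, 0.0)])

Running such a force law at the device's servo rate (typically around 1 kHz) while rendering the object's image separately is what allows visual and haptic cues to be decoupled and manipulated independently.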
 
Description Prof. M. Santello 
Organisation Arizona State University
Department School of Biological and Health Systems Engineering
Country United States 
Sector Academic/University 
PI Contribution I provided knowledge about the use of non-invasive brain stimulation techniques during performance of skilled hand movements.
Collaborator Contribution They provided expertise in using force sensors to study dexterous object manipulation in humans.
Impact We have published one paper so far: Corticospinal excitability underlying digit force planning for grasping in humans. Journal of Neurophysiology, 2014. DOI: 10.1152/jn.00815.2013
Start Year 2012
 
Description Prof. R. Flanagan 
Organisation Queen's University
Country Canada 
Sector Academic/University 
PI Contribution I provided knowledge about the use of non-invasive brain stimulation.
Collaborator Contribution They provided knowledge about functional brain imaging techniques to locate targets for transcranial magnetic stimulation.
Impact Royal Society international exchange grant
Start Year 2013
 
Description Talk at the UK sensorimotor conference 2016 (Newcastle, UK)
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Postgraduate students
Results and Impact Presentation of early results showing new connectivity pathways between the brain and spinal cord.
Year(s) Of Engagement Activity 2016
 
Description YouTube video 
Form Of Engagement Activity A press release, press conference or response to a media enquiry/interview
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Public/other audiences
Results and Impact Our lab was interviewed after our paper made the cover of the Journal of Neuroscience on 12 October 2016.
We described the experiment and the main results.
Year(s) Of Engagement Activity 2016
URL https://www.youtube.com/watch?v=1k9cZ8QNuxU