HUMAN-Touch: Physical and Neurocognitive AI models of Ultrasound Haptics
Lead Research Organisation:
Ultraleap
Department Name: Research and development
Abstract
When we touch a physical object, a sequence of mechanical events occurs whereby vibration is transmitted via the hard and soft tissues of the hand. The signals generated during object manipulation are then transduced into neural signals via ascending sensory pathways that our brain interprets as touch. When combined with signals from our other senses, memories and expectations, this information forms our perception of the physical and psychological worlds. With modern technology, it is possible to generate immersive environments with breath-taking graphics, yet touch technologies (also known as haptics) capable of realistically and unobtrusively emulating the sense of touch have only just begun to emerge.
This future leaders fellowship (FLF) aims to unlock new potential in non-contact touch technologies by holistically understanding both the physical and psychophysical dimensions of ultrasound mid-air haptics. To that end, we will lead ground-breaking R&D across acoustics, biophysics, neuroscience and artificial intelligence (AI).
Mid-air haptics refers to electronically controlled collections of ultrasound speakers (phased arrays) that collectively generate complex acoustic fields in 3D space that can be touched and felt with our bare hands. Holographic 3D objects and surfaces can therefore be "haptified" and interacted with in mid-air, without the need to wear or hold any specialised controllers; a feature particularly appreciated in public display interfaces to limit the spread of pathogens. Coupled with augmented and virtual reality solutions, the technology enables design and remote-collaboration scenarios of the kind often seen in Sci-Fi movies such as Iron Man and Minority Report.
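The focusing principle behind such phased arrays can be sketched in a few lines: each transducer is driven with a phase delay proportional to its distance from the desired focal point, so that all waves arrive there in phase. The array geometry and parameters below are illustrative assumptions, not Ultraleap's actual hardware design.

```python
import numpy as np

SPEED_OF_SOUND = 343.0           # m/s, air at ~20 degrees C
FREQ = 40_000.0                  # Hz, typical airborne-ultrasound carrier
K = 2 * np.pi * FREQ / SPEED_OF_SOUND   # wavenumber

def focusing_phases(transducer_xy, focus_xyz):
    """Phase delay (radians) for each transducer so all emitted waves
    arrive in phase at the focal point: phi_i = k * |r_i - r_f| (mod 2*pi)."""
    positions = np.column_stack([transducer_xy, np.zeros(len(transducer_xy))])
    distances = np.linalg.norm(positions - focus_xyz, axis=1)
    return (K * distances) % (2 * np.pi)

# Hypothetical 16x16 grid with ~1 cm pitch, focus 20 cm above the centre
pitch = 0.0103
coords = (np.arange(16) - 7.5) * pitch
xy = np.array([(x, y) for x in coords for y in coords])
phases = focusing_phases(xy, np.array([0.0, 0.0, 0.2]))
```

Because the focal point sits above the array centre, the resulting phase map is symmetric about both axes of the grid, which is a quick sanity check on the geometry.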
R&D in mid-air haptics has been accelerating in recent years, yet has almost exclusively focused on hardware advancements, acoustic signal processing, and human-computer interaction (HCI) use cases. We believe that the true potential of ultrasound mid-air haptics is still unexplored, an opportunity uniquely available to be exploited by this FLF. Current mid-air haptic displays, such as those commercialised by Ultraleap, only target one type of touch receptor (mechanoreceptors), which limits device expressivity. Biophysical models capturing how acoustic waves interact with the skin are in their infancy and are experimentally unverified. Generative and computational models connecting phased array output, acoustic focusing waves, skin vibrations, mechanoreceptors, and psychophysical experiences are absent. This fellowship will be the first to thread these together. We will study ultrasonic mid-air haptics from first principles (i.e., acoustics and biophysics) all the way to perception and neurocognition. We will understand how localised acoustic energy generates non-localised skin vibrations, how those vibrations activate different touch receptors in the skin, and how receptors encode information that our somatosensory system then understands as touch. Once the forward problem is pieced together, our aim is to use machine learning to construct generative AI models enabling us to solve the inverse problem. What input ultrasound signals should be used to create the tactile sensation of holding a high-quality piece of paper? Today, there is no scientific way of answering such a question, even if we know that something like this is possible. Being able to bridge the different scientific fields related to ultrasonic mid-air haptics to create a holistic understanding of holographic touch is uniquely enabled by this FLF application.
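The forward/inverse framing above can be illustrated with a toy version of the acoustic inverse problem: given target pressure amplitudes at a few control points, find phase-only transducer drives that reproduce them, here via a simple Gerchberg-Saxton-style back-and-forth projection. All geometry, parameters, and the monopole propagation model are illustrative assumptions, not the fellowship's actual models.

```python
import numpy as np

rng = np.random.default_rng(0)
K = 2 * np.pi * 40_000.0 / 343.0      # wavenumber at 40 kHz in air

def propagator(tx, points):
    """Complex monopole propagation matrix A[j, i]: transducer i -> point j."""
    d = np.linalg.norm(points[:, None, :] - tx[None, :, :], axis=2)
    return np.exp(1j * K * d) / d

# Hypothetical 64 transducers on a plane, 3 control points 20 cm above it
tx = np.column_stack([rng.uniform(-0.08, 0.08, (64, 2)), np.zeros(64)])
pts = np.array([[0.0, 0.0, 0.2], [0.03, 0.0, 0.2], [0.0, 0.03, 0.2]])
A = propagator(tx, pts)
target = np.ones(3)                   # equal target amplitude at each point

x = np.exp(1j * rng.uniform(0, 2 * np.pi, 64))    # random phase-only start
for _ in range(200):
    p = A @ x                                     # forward: field at points
    x = A.conj().T @ (target * p / np.abs(p))     # back-project, keep target amps
    x = x / np.abs(x)                             # enforce phase-only drive

amps = np.abs(A @ x)                  # achieved amplitudes at control points
```

The real inverse problem the fellowship targets is far harder, since the "target" is a tactile percept rather than an acoustic amplitude; the point of the sketch is only that the acoustic layer of the chain is already a well-posed optimisation.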
This 4-year, full-time, reduced hours FLF will support a cross-disciplinary and agile team of 2 postdoctoral research associates (RAs) led by the fellow, while being hosted at the only company in the world that is commercialising mid-air haptics, thus providing the fellowship with access to unique resources, engineering insights, and a direct pathway to economic and societal impact.
Organisations
People
ORCID iD
William Frier (Principal Investigator / Fellow)
Description | Ultraleap's (the host) unique mid-air haptic technology relies on focused ultrasound to induce tactile sensations in the user without the user holding or wearing any additional device. Through this grant, our team discovered that the perceived strength of mid-air haptic devices could be predicted by more than acoustic pressure alone. |
Exploitation Route | We are preparing an article detailing this discovery for publication in a scientific journal. |
Sectors | Digital/Communication/Information Technologies (including Software) |
Description | Early results of our findings allowed us to reshape the requirements for Ultraleap's (the host business) research and development strategy. |
First Year Of Impact | 2023 |
Sector | Digital/Communication/Information Technologies (including Software) |