Monday 16 July 2018



Vincent Hayward, Leverhulme Visiting Professor at the Institute of Philosophy (IP), School of Advanced Study, has won the best hands-on demonstration award at the 2018 EuroHaptics international conference for a device that will help deafblind people communicate remotely.

Deafblind individuals use a system of communication that involves rapid hand and finger movements to touch the other person’s fingers and hand. This highly effective form of communication requires individuals to be in physical contact with one another. This new invention, however, will enable people to communicate at a distance.

‘The winning device, HaptiComm, uses specially designed tactile transducers to provide sensations that closely resemble those elicited by real fingers tapping and sliding on the skin of the palm and fingers,’ explains Professor Hayward. ‘These sensations, which must be felt with speed and clarity, are the fundamental elements on which tactile communication is built. They differ dramatically from the diffuse buzzing sensations given by the vibration motors found in portable devices, or from the jolts felt in game pads.’

Working with IP research fellow Sven Topp from Australia, and PhD candidate Basil Duvernoy, Professor Hayward has developed an affordable technology capable of reproducing the formal deafblind communication method, which can be learned quickly.

HaptiComm offers direct speech-to-haptic-language translation at speeds of up to 12–14 actuations per second. This means that a person can speak normally into a machine at one location and the communication is converted into a sequence of haptic codes that can be sent across the internet. At the other end, a deafblind person simply places his or her hand on the plastic frame of the HaptiComm device, and a series of small rods move up through the frame to provide the tactile feedback that spells out letters and words to the user.
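The pipeline described above — speech recognised as text, encoded as a timed sequence of actuations, and replayed by the rods under the user’s hand — can be sketched roughly as follows. This is an illustrative sketch only: the article does not publish HaptiComm’s actual encoding or hardware interface, so all names, the letter-to-tap mapping, and the actuator layout here are hypothetical. Only the 12–14 actuations-per-second rate comes from the article.

```python
# Hypothetical sketch of a speech-to-haptic encoding pipeline.
# The letter-to-tap mapping and actuator layout are invented for
# illustration; only the actuation-rate bound is from the article.

from dataclasses import dataclass
from typing import List

MAX_RATE_HZ = 14  # upper bound on actuations per second cited in the article


@dataclass
class Tap:
    actuator: int      # index of a rod under the palm/fingers (hypothetical layout)
    duration_s: float  # how long the rod stays raised


# Hypothetical mapping from letters to tap sequences.
LETTER_TO_TAPS = {
    "h": [Tap(actuator=0, duration_s=0.05)],
    "i": [Tap(actuator=1, duration_s=0.05)],
}


def encode(text: str) -> List[Tap]:
    """Convert recognised speech (already transcribed to text) into taps."""
    taps: List[Tap] = []
    for ch in text.lower():
        taps.extend(LETTER_TO_TAPS.get(ch, []))  # skip unmapped characters
    return taps


def min_playback_time(taps: List[Tap]) -> float:
    """Lower bound on playback time, given the device's actuation rate."""
    return len(taps) / MAX_RATE_HZ


# The tap sequence, not audio, is what would be sent across the internet
# to the remote device, which then drives its rods to replay it.
sequence = encode("hi")
```

In a real system the tap sequence would be serialised and streamed to the remote device; the point of the sketch is simply that the payload is a compact series of timed actuator events rather than audio.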


The haptic language is identical to the one that Sven Topp, who is deafblind, uses with his interpreters, who convert speech and text into tactile stimuli. He says, 'The overall design and principles of HaptiComm allow for a large range of design implementation, input and tactile domain output that provides a remarkable breadth in its applications as a communication platform.'

Deafblindness dramatically affects the communication capacity of around 100,000 Australians and 350,000 people in the UK. These figures are expected to rise as the population ages. Lack of visual and auditory communication channels can prevent deafblind individuals from enjoying meaningful interactions with people and their environment. This can lead to depression and a plethora of other mental health issues.

In addition, rapid and extended use of hand and finger movements requires sustained attention, and can cause physical and mental stress for both interpreters and deafblind individuals.

To meet the extensive requirements of the deafblind sector, HaptiComm had to accomplish a number of goals: it had to be inexpensive to produce and maintain, flexible in design, and programmable to meet varied, personalised needs.

The device debuted at Germany’s 2017 World Haptics Conference, where it was placed among the top four finalists. The demonstration of an evolved version at EuroHaptics 2018, held in Italy, saw it win the ‘best hands-on demonstration’ category with a system that exceeded many of its original goals.

This unique communication development platform offers high levels of flexibility and versatility for the haptics research community, and was well received by deafblind participants at the 2018 Helen Keller World Conference in Spain.

The HaptiComm project (Tactile communication technology for use by the deafblind, sponsored by Google LLC, Sorbonne Université, and the Institute of Philosophy) introduces a paradigm shift in haptic adaptive technology. Rather than approaching the deafblind community with ‘Here’s a device that will create a code you can learn to understand speech’, Professor Hayward and his team proceeded with, ‘Here’s a device you can program and modify to suit your personal needs and communicate in your own natural language.’

Its unique approach takes advantage of tactile communication techniques already used in the deafblind community. HaptiComm’s stimulation system for producing nonvisual image patterns on the user’s skin can also produce, for example, vibratory Braille and the Lorm (Germany) and Malossi (Italy) alphabets, and runs through Google voice assistant. It will reduce the learning curve and transform the possibilities for communication at a distance for the deafblind community worldwide.


Notes for Editors:

For further information, please contact: Maureen McTaggart, Media and Public Relations Officer, School of Advanced Study, University of London, +44 (0)20 7862 8653