Hearing superpowers!

Did you know that vision has a calibrating role in locating events in space? But then how do blind people locate the sounds around them? The answer may surprise you...

Vision’s predominant role in the perception of space

We have all seen a ventriloquist perform. The ventriloquist produces the sound with their throat and rib cage while moving the puppet’s mouth, so the voice seems to come from the puppet: what we see captures where we hear. Several laboratory experiments have confirmed this dominant role of vision. They consisted of emitting a sound (audio) and a flash (visual) at two different places in a room, then asking participants where the audiovisual event came from. Participants systematically pointed to the flash. ‘On the basis of this kind of work, a theory was born that vision is so important for locating events in space that it has a calibrating role for the other senses,’ explains Olivier Collignon, a neuropsychologist and supervisor of a PhD study carried out by Ceren Battal and published in Psychological Science. ‘It’s the visual predominance over the other senses that explains this phenomenon. We generally recalibrate the spatial origin of a sound with its visual origin because vision predominates in the perception of space.’

What about blind people?

Prof. Olivier Collignon has always been interested in how certain cognitive and perceptual skills develop in persons deprived of a sense. In the context of this theory, he asked himself: What about blind people? If vision is so necessary and important for developing the sense of localisation of events in space, are blind people deprived of this calibrator? Do they not have access to this visual guide that tells them where sound comes from? If this theory of visual calibration is true, would those born blind or blinded early in life simply be unable to locate sounds? This is what PhD student Ceren Battal set out to test.

Locating a sound

Ms Battal started from this hypothesis for her behavioural study. She presented 16 totally blind and 16 sighted participants with pairs of sounds whose separation she varied. Participants were then asked to say whether the second sound was more to the right or to the left of the first, or above or below it. She relied on an array of 64 loudspeakers that could present sounds across a wide range of positions, then analysed the accuracy of participants’ responses while gradually decreasing the distance between the two sounds.
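For readers curious about how this kind of accuracy data is typically analysed, here is a minimal, purely illustrative sketch (not the study’s actual analysis code, and with invented numbers): it fits a simple psychometric curve to the proportion of correct judgements as a function of the angular separation between the two sounds, and reads off the separation at which a participant reaches 75% correct. It assumes only the standard numpy and scipy libraries.

```python
# Illustrative sketch (not the study's code): estimate the smallest angular
# separation at which left/right judgements reach a criterion accuracy,
# by fitting a logistic psychometric curve to hypothetical data.
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical data: angular separations (degrees) between the two sounds,
# and the proportion of correct responses at each separation.
separations = np.array([1, 2, 4, 8, 16, 32], dtype=float)
prop_correct = np.array([0.52, 0.58, 0.70, 0.85, 0.95, 0.99])

def psychometric(x, threshold, slope):
    """Logistic curve rising from chance (0.5) to 1.0."""
    return 0.5 + 0.5 / (1.0 + np.exp(-slope * (x - threshold)))

# Fit the curve to the data.
params, _ = curve_fit(psychometric, separations, prop_correct, p0=[5.0, 0.5])
threshold, slope = params

# With this parameterisation, psychometric(threshold) = 0.75, so the fitted
# 'threshold' is directly the separation giving 75% correct responses.
print(f"Estimated 75%-correct separation: {threshold:.1f} degrees")
```

In an analysis along these lines, a smaller fitted threshold means finer spatial hearing; the result described in the next section would correspond to the blind group reaching criterion accuracy at smaller separations than the sighted group.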

Astonishing conclusion

What Ms Battal observed defies all predictions: ‘Against all odds,’ Prof. Collignon explains, ‘the blind were better at this exercise than the sighted. At some point the sounds are so close together that it becomes impossible to tell whether the second is more to the left or right, or above or below. But this minimum distance is shorter in blind persons than in sighted persons.’ Not only do those who have been deprived of sight since birth have no problem locating sounds in space, they do so more precisely than sighted people. In the absence of vision, hearing therefore seems to become more refined, as blind people compensate by sharpening their remaining senses.

Really so novel?

For Prof. Collignon, this discovery puts an end to previous debates. After all, isn’t it common to say that people deprived of one sense overdevelop the others? ‘That’s true,’ he replies, ‘but as far as spatial localisation is concerned, the debate remained lively, because the dominant role of vision in the perception of space suggested that in its absence blind people could have problems. It is commonly said that blind people have perfect pitch, for example. But that kind of claim only concerns situations where vision doesn’t play a calibrating role. In the perception of space, several articles prior to our research suggested that blind people were deficient in spatial localisation, including the localisation of sounds.’

And what happens in the brain?

Now that it has been observed at the behavioural level how blind people locate sounds, another study is already underway, this time using neuroimaging. What happens in the brain of blind persons during this experiment? ‘One of the main hypotheses,’ Prof. Collignon replies, ‘is that the regions that normally process visual space (certain regions of the occipital cortex, located at the back of the brain) are also activated in the blind during auditory spatial processing, in addition to the auditory cortex.’ But this hypothesis will only be confirmed or disproved in a forthcoming article.

 
A tablet that reacts to touch

On 1 October, two UCLouvain PhD students got off to a flying start on the Multitouch project. One of them is supervised by Olivier Collignon, who tells us more about the project.

In what framework was this project born?

This is a European project that is part of the Marie Skłodowska-Curie Actions (MSCA). One of these actions, called Innovative Training Networks (ITN), funds PhD scholarships on a particular topic across European universities. Together, these European PhD students will study a multidisciplinary topic and propose innovative solutions.

What are UCLouvain’s specialities for this project?

At UCLouvain, two PhD students have been selected for this project: one will study the perception of tactile shapes and textures with Prof. André Mouraux, while the other will work on the perception of tactile movement with me.

What exactly is the Multitouch project?

Today, unsurprisingly, we interact more and more with screens in everyday life: to buy a train ticket, pay in a shop, get directions, etc. But one thing is still missing from our screens: tactile, or haptic, feedback. Imagine, for example, that you touch a fish on your screen and feel its scales, its temperature, even the water. Or imagine that you’re driving and want to change the radio station, and you can feel the buttons on your screen to do so.

What’s the value of this project?

First of all, its multisensory dimension would bring the on-screen experience closer to reality. It could also be useful when visibility is limited: in poor light or while driving, you could still use your screen. Finally, people deprived of vision could benefit. To date, blind people have used software that reads a phone’s menus aloud, for example. That is already a small miracle, but haptic feedback could take things further: blind people could feel the buttons on their screen and navigate in a pleasant, ergonomic way.

That’s promising for them …

Yes, thanks to haptic feedback, one could even imagine blind people engaging with visual art. They could experience Klimt’s ‘The Kiss’ through touch rather than through an auditory description of what the work depicts.

At European level, what other specialities will be represented?

The collaboration brings together the University of Lille (France), with expertise in mechatronics, electrical engineering and computer science; the Fondazione Istituto Italiano di Tecnologia (Italy), with expertise in psychology and in neuroscientific and biomedical engineering; the Universitatea Stefan Cel Mare Din Suceava (Romania), with expertise in computer science and information and communication technology; and UCLouvain, with expertise in psychophysics and neurophysiology.

What are the project’s prospects?

By pooling all this academic knowledge, Europe hopes to boost the development of a tablet that works with haptic feedback. To do this, the universities will collaborate with industrial firms, and, who knows, in four years, at the end of the project, there may be a marketable product.

Lauranne Garitte

 

Have a look at Olivier Collignon's bio

After earning a master’s degree in neuropsychology from the University of Liège and a master’s degree in cognitive science from UCLouvain, Olivier Collignon specialised in cognitive neuroscience, completing a PhD at UCLouvain on cerebral plasticity in blindness. He has pursued this subject throughout his career, including during two postdocs at the Université de Montréal’s Centre de recherche en neuropsychologie et cognition. Since 2012, he has led the UCLouvain research group Crossmodal Perception and Plasticity Lab (CPP Lab) and served as an associate professor at the University of Trento’s Centre for Mind and Brain Science (CIMeC). Since 2016, he has been an FNRS research associate and associate professor at UCLouvain. His research is funded by multiple UCLouvain sources (e.g. FSR, Louvain Cooperation), the FNRS (e.g. MIS, EOS), and European sources (e.g. ERC Grant, Marie Curie). His research focuses on the general question: How does a brain area develop, maintain and change its sensory and functional role?

Olivier Collignon, Visage de la recherche

Published on December 17, 2020