Can we see with our ears?
The EyeMusic sensory substitution device (SSD), developed by researchers at the Hebrew University of Jerusalem in Israel, produces sounds based on what its camera can see. Blind people trained to understand the sounds can perceive images without using their eyes, and are able to describe faces, understand other people’s emotions, identify body postures, read words and even select the only red apple from a plate of fruit.
It works by representing the height of objects with the pitch of the sound – tall objects produce high-pitched sounds, while low objects produce lower pitches.
The width of an object is represented by the duration of the sound; colour is represented by different musical instruments, including the violin, trumpet and organ, and different colour shades are conveyed through variations in volume.
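As a rough illustration of this kind of pixel-to-sound mapping, a minimal sketch in Python is given below. It is not the EyeMusic algorithm itself: the instrument names, pitch range and timing values are assumptions chosen only to mirror the scheme described above, with the image scanned column by column so that each lit pixel contributes a note.

```python
# Hypothetical pixel-to-sound mapping in the spirit described above:
# row (vertical position) -> pitch, column -> timing, colour -> instrument,
# brightness -> volume. Values and instrument names are illustrative only.

INSTRUMENTS = {"red": "organ", "blue": "trumpet", "yellow": "violin"}  # assumed mapping

def image_to_notes(pixels, column_duration=0.06, base_midi=40, pitch_span=48):
    """Convert a grid of (colour, brightness) pixels into note events.

    pixels: rows of pixels, top row first; each pixel is (colour, brightness 0-1).
    Returns (start_time, duration, midi_pitch, instrument, volume) tuples.
    """
    n_rows = len(pixels)
    notes = []
    for col in range(len(pixels[0])):
        start = col * column_duration              # left-to-right scan: column -> time
        for row in range(n_rows):
            colour, brightness = pixels[row][col]
            if brightness <= 0:
                continue                           # skip empty pixels
            # Pixels nearer the top of the image get higher pitches.
            pitch = base_midi + round((n_rows - 1 - row) / max(n_rows - 1, 1) * pitch_span)
            notes.append((start, column_duration, pitch,
                          INSTRUMENTS.get(colour, "piano"), brightness))
    return notes

# A 2x3 toy image: a red pixel in the top-left corner and a dimmer blue pixel in the middle.
demo = [[("red", 1.0), (None, 0.0), (None, 0.0)],
        [(None, 0.0), ("blue", 0.5), (None, 0.0)]]
print(image_to_notes(demo))
```

In such a scheme, a wide object occupies several consecutive columns and therefore sounds for longer, which is one way the width-to-duration idea can be realised.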
The result might sound a little grating but, with the right training, users can develop a fluent understanding of the tones, allowing them to form images in their minds. The work is opening up the possibility of using these devices in rehabilitation programmes to improve blind people’s independence and quality of life.
For the researchers, one of the most striking findings from their work has been how the brain reacts to this new sensory input. Although the ears are doing all the work in detecting signals from the outside world, it is the brain’s visual centres that light up when viewed using functional MRI (fMRI) scanning – a technology that shows which brain areas are most active while performing a specific task.
This has contributed to the conclusion that humans see with their brains rather than their eyes and that the brain is much more flexible than previously imagined.
‘Our work with SSDs highlights that the brain is a flexible task-machine rather than a sensory machine as previously conceived,’ said Professor Amir Amedi who leads the BRAINVISIONREHAB project, which developed the EyeMusic device and was funded by the EU’s European Research Council.
‘The exact same areas found to be in use in the sighted population when perceiving body shapes through vision, for instance, were activated in the congenitally blind population when perceiving the same body shapes via audition, using an SSD.’
Blind people have successfully used the EyeMusic device to describe faces. Video courtesy of BRAINVISIONREHAB
Using hearing instead of sight is not the only sensory substitution that Prof. Amedi’s team has been working on. Their EyeCane is a device that vibrates when a blind person waves it in front of them, providing information about the distance between the user and objects around them. This allows blind people to navigate their way around a room or through a maze, ultimately building a 3D map of their environment in much the same way as a bat navigates using sound.
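The underlying idea of converting distance into vibration can be captured with a very small sketch, given below. It is hypothetical and written in Python for illustration; the sensor interface, range limit and linear scaling are assumptions rather than the project's actual design.

```python
# Hypothetical distance-to-vibration mapping for an EyeCane-style device:
# a rangefinder reading in front of the user is turned into a vibration
# intensity that grows as obstacles get closer. Thresholds are assumed.

MAX_RANGE_M = 5.0  # assumed maximum useful sensing distance in metres

def vibration_strength(distance_m):
    """Return a vibration intensity between 0 (nothing detected) and 1 (very close)."""
    if distance_m is None or distance_m >= MAX_RANGE_M:
        return 0.0                      # nothing detected within range
    # Closer objects produce stronger vibration, scaled linearly here.
    return 1.0 - (distance_m / MAX_RANGE_M)

# Sweeping the device across a room yields a series of readings; the changing
# vibration pattern is what lets the user build a mental map of the space.
for reading in [4.8, 3.0, 1.2, 0.4, None]:
    print(reading, "->", round(vibration_strength(reading), 2))
```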
‘We are working to bring this device to the blind community as soon as possible,’ said Prof. Amedi. ‘It is both relatively cheap to manufacture and incredibly intuitive to learn.’
The group is also working on combining the EyeMusic and the EyeCane to further enrich the information that blind users can perceive through sensory substitution devices.
Adaptable
As researchers delve deeper into the science of the senses, they are discovering just how adaptable our brains are to dealing with different sensory inputs. Prof. Amedi’s lab has been studying what happens in the brain when people read Braille – a tactile writing system used by people who are blind.
‘The brain is a flexible task-machine rather than a sensory machine.’
It was already known that in people who can see, a region of the brain called the Visual Word Form Area develops when they learn to read with their eyes. Using fMRI, researchers found that the same region is active when blind people read Braille.
However, as part of the EU-funded METABRAILLE project, Dr Marcin Szwed, at the Jagiellonian University in Krakow, Poland, is studying what happens when people who can see read Braille.
Dr Szwed taught sighted people to read Braille over a nine-month period and scanned their brains throughout the process.
‘We found that when subjects read the tactile alphabet the most active area of the brain is the visual cortex rather than the areas associated with touch,’ he explained.
To confirm their finding, the team used transcranial magnetic stimulation to temporarily inhibit the visual cortex. With their visual centres disrupted, the subjects’ tactile reading was impaired.
Dr Szwed said these studies give further insights into the plasticity of the brain, showing that areas once thought to be dedicated to a single set of tasks – such as touch or sight – can be recruited for other jobs. It is also clear that there is significant communication between these areas when the brain is performing several related processes at once.
This, he suggests, could help to explain why humans are so good at learning complex tasks such as driving or playing musical instruments. Our brains are much more flexible than we think, meaning it may be time to update the way we think about them.
‘If you look at medical textbooks, the pictures of the brain make it look like each brain area has a fixed role,’ said Dr Szwed. ‘What I hope to achieve is to have that picture changed.’