The human tongue can help blind people see the world. Here's how

Have you ever wondered why kissing feels nicer than holding hands? The tongue is an impressive piece of kit, but it is notoriously difficult to study because of its location inside the mouth.

Obviously the tongue gives us access to the world of flavour, but more than that, it is more sensitive to touch than the fingertips. Without it, we could not speak, sing, breathe efficiently or swallow delicious drinks.

So why don't we use it more?

My current research explores how we might use this strange organ, potentially as a way to help people with visual impairments navigate and even exercise. This may sound odd, but bear with me.

My research falls within a field called 'sensory substitution', which blends psychology, neuroscience, computer science and engineering to develop 'sensory substitution devices' (known as SSDs).

SSDs convert sensory information from one sense into another. For example, a device designed for a person with a visual impairment typically converts the visual information from a video feed into sound or touch.

Drawing images on the tongue

BrainPort, first developed in 1998, is one example of such technology. It converts a camera's video feed into moving patterns of electrical stimulation on the surface of the tongue.

The "tongue display" (a small device that looks like a lollipop) is made up of a grid of tiny electrodes, each corresponding to a pixel in the camera's feed. It produces a low-resolution tactile display on your tongue that matches the camera's output.
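The pixel-to-electrode mapping can be sketched in a few lines of code. This is an illustrative simplification, not BrainPort's actual algorithm: the grid size, number of stimulation levels and function name are assumptions made for the example.

```python
import numpy as np

def frame_to_electrode_grid(frame, grid_size=20, levels=8):
    """Downsample a grayscale camera frame to a small grid of
    stimulation intensities, one value per tongue electrode.

    Hypothetical sketch: the real device's resolution and mapping
    are not public. frame is a 2D array of brightness values (0-255);
    returns a grid_size x grid_size array of integer levels.
    """
    h, w = frame.shape
    # Crop so the frame divides evenly into grid_size blocks.
    frame = frame[: h - h % grid_size, : w - w % grid_size]
    bh = frame.shape[0] // grid_size
    bw = frame.shape[1] // grid_size
    # Average each block of pixels into a single electrode value.
    blocks = frame.reshape(grid_size, bh, grid_size, bw).mean(axis=(1, 3))
    # Quantise brightness into a few discrete stimulation levels.
    return (blocks / 256 * levels).astype(int)
```

Brighter regions of the scene would then drive stronger stimulation at the matching spot on the tongue, which is what lets a shape held up to the camera be "felt" as the same shape on the tongue.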

The technology can also help stroke patients keep their balance. In 2015, the US Food and Drug Administration approved it as an aid for people with visual impairments.

Imagine holding your hand up to a camera and feeling a tiny hand appear on the tip of your tongue. It's a bit like someone drawing pictures on your tongue with popping candy.

Although the BrainPort has been around for some time, it hasn't seen significant real-world use, despite being around ten times cheaper than a retinal implant. I use the BrainPort to study human attention on the surface of the tongue, to see whether differences in perception could explain this.

In psychology research, there is a well-known technique for measuring attention called the Posner cueing paradigm, named after its creator, the American psychologist Michael Posner, who developed it in the 1980s to measure visual attention.

When I refer to attention, I don't mean 'attention span'. Attention here refers to the set of processes that bring objects in our environment into our awareness. Posner discovered that attention can be triggered by visual cues.
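The logic of a single cueing trial can be sketched in code. This is a hypothetical illustration of the trial structure, not our experimental software; the cue-validity probability and reaction times are made-up values chosen to show the classic effect (faster responses when the target appears where the cue drew attention).

```python
import random

def run_trial(rng, valid_probability=0.8):
    """One simulated Posner cueing trial: a cue flashes on one side,
    then a target appears on the cued (valid) or opposite (invalid)
    side. The illustrative reaction times encode the classic finding
    that validly cued targets are detected faster."""
    cue = rng.choice(["left", "right"])
    valid = rng.random() < valid_probability
    target = cue if valid else ("left" if cue == "right" else "right")
    reaction_time_ms = 250 if valid else 310  # attended side is faster
    return cue, target, reaction_time_ms
```

Averaged over many trials, the gap between valid and invalid reaction times is taken as a measure of how strongly the cue shifted attention.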

If we glimpse movement out of the corner of our eye, our attention is drawn to that location. We probably evolved this way to react quickly to danger, such as snakes lurking in corners and at the outer edges of our field of vision.

The same process is triggered across the senses. If you've ever sat in a pub garden in summer and heard the unwelcome buzz of a wasp in one ear, you'll know that your attention is rapidly drawn to that side of your body.

The sound of the wasp draws your attention to its rough location, so your brain can quickly direct visual attention to pinpoint exactly where it is, and tactile attention to swiftly swat the wasp away or avoid it.

This is known as 'cross-modal' attention (vision is one modality, hearing another): events in one sense can influence the others.

Pay attention to your tongue

My colleagues and I developed a version of the Posner cueing paradigm to test whether the brain can focus attention on the surface of the tongue in the same way it does on the hands, or in other modalities.

We know a great deal about visual attention, and about tactile attention on the hands and other parts of the body. But we had no idea whether that knowledge applies to the tongue.

This matters because the BrainPort is built, marketed and sold as a device that helps people perceive the world through their tongues. But we need to understand whether 'seeing' through the tongue really is equivalent to seeing with the eyes.

It turns out that, like nearly everything else in life, it's complicated.

The tongue responds to cues in roughly the same way as vision or the hands. But despite the tongue's amazing sensitivity, its attentional processes are not as powerful as those of the other senses. It is also easy to over-stimulate the tongue, causing sensory overload that can make it difficult to understand what's happening.


We also found that attention on the tongue is influenced by sound. For instance, when a BrainPort user hears a sound to their left, they are better at identifying information on the left side of their tongue. This could help guide attention and reduce sensory overload when the BrainPort is used alongside an audio interface.

A key challenge for real-world use of the BrainPort is managing how much visual information is substituted. Where possible, another sense can be used to share some of the load.

As a stand-alone device, the BrainPort may be over-stimulating without conveying accurate information. It could be improved by pairing it with other assistive technologies, such as the vOICe.

We're using our findings to develop a device that helps rock climbers with visual impairments climb with confidence.

To avoid information overload, we're using machine learning to recognise climbing holds and filter out unnecessary information.
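One way to picture this filtering step, as a rough sketch rather than the team's actual pipeline: assume some detector has already returned bounding boxes for the holds, then blank out everything else in the frame before it is mapped to the tactile display. The function name and box format here are assumptions.

```python
import numpy as np

def keep_only_holds(frame, hold_boxes):
    """Zero out every pixel that is not inside a detected hold.

    Hypothetical sketch: frame is a 2D grayscale array, and
    hold_boxes is a list of (row, col, height, width) boxes that
    some upstream hold detector is assumed to have produced.
    """
    mask = np.zeros_like(frame)
    for r, c, h, w in hold_boxes:
        mask[r:r + h, c:c + w] = 1  # keep pixels inside each hold box
    return frame * mask             # everything else becomes zero
```

Because the tongue display has so few "pixels" and is easy to overload, discarding everything except the holds leaves the limited tactile bandwidth for the information the climber actually needs.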

We're also exploring how to use sound to signal where the next hold is, then using feedback on the tongue to pinpoint its exact position.

With a few adjustments, the technology could eventually become a more reliable tool to help people who are deaf, blind or visually impaired navigate. It could even help people with paralysis who are unable to use their hands to communicate better.

The Conversation

