
What can your tongue see?
Picture this: you’re riding your bike on a cool fall afternoon. You’re blind, yet you navigate the streets with ease. The world comes into view through waves of sound bouncing off nearby objects in response to the bursts of clicks you make with your tongue. This is everyday life for Daniel Kish, who, since losing both of his eyes at a young age, has learned to navigate the world using echolocation. The technique, which he calls flash sonar, is similar to the way bats construct images of their surroundings in complete darkness.
So, what does it mean to see? At a basic level, we can describe “seeing” as the culmination of a multi-step process. First, specialized cells in the eye convert light into electrochemical signals. These signals are then passed on to the brain, where they propagate through regions that process the information in increasingly complex ways, eventually producing a perceptual representation of a visual scene. Put simply, a sensor collects information about the world and converts it into a signal the brain can interpret. In this way, our perception is limited to what our functional sensors detect. (For more on how we count and define senses, check out Ethan’s previous NeuWrite post!) Since not all animals have the same sensors, each species builds its own perceptual representation from the information it can collect about the world. For example, flowers guide bees to their nectar with ultraviolet patterns that are invisible to the human eye, which detects only a tiny slice of the electromagnetic spectrum known as visible light.

The same flower in visible (left), ultraviolet (center), and infrared (right) light
Image by Dave Kennard
When something goes awry with our sensors, as is the case for people who are blind or deaf, technology can step in to reroute information to the brain through another channel. This is the idea behind sensory substitution: a functional sense (e.g., touch) learns to interpret (e.g., feel) sensory input typically perceived by another sense (e.g., vision). Just as Daniel Kish can “see” using sound, blind individuals can learn to “see” through a transformation of light into vibration. Paul Bach-y-Rita pioneered this work in the 1960s with a vest for the blind that translated visual scenes into vibrating patterns on the wearer’s back [1]. As futuristic as it sounds, seeing through vibratory patterns isn’t much different from reading with your fingertips, as in braille; technology is simply expanding the realm of possibilities for how we interact with the world.
Wicab Inc., founded by Bach-y-Rita, now sells an updated version of this technology called the BrainPort device, which uses a glasses-mounted video camera to capture a scene and translates that visual information into electrical impulses delivered to the tongue’s surface via an electrode array. One of the biggest advantages of using the tongue is that it is densely populated with touch sensors, so you can feel two points as distinct even when they are very close together. Users liken the sensation to the fizzy feeling of Pop Rocks dissolving on the tongue. With training, participants using the BrainPort device are able to identify objects, recognize words presented on a computer screen [2], and even navigate obstacle courses [3].
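To make that pipeline a little more concrete, here is a minimal sketch (in Python, with NumPy) of the kind of image-to-tactile mapping such a device performs: downsample each camera frame to a coarse grid, one cell per contact point, and quantize brightness into a handful of stimulation levels. The function name, the 20 × 20 grid, and the eight intensity levels are illustrative assumptions on my part, not the BrainPort’s actual processing.

```python
import numpy as np

def frame_to_electrode_pattern(frame, grid_shape=(20, 20), levels=8):
    """Map a grayscale camera frame onto a coarse electrode grid.

    frame: 2-D array of pixel brightness values (0-255).
    grid_shape: rows x cols of the electrode array (assumed size).
    levels: number of distinct stimulation intensities (assumed).
    """
    rows, cols = grid_shape
    h, w = frame.shape
    bh, bw = h // rows, w // cols
    # Trim so the frame divides evenly into electrode-sized blocks.
    trimmed = frame[: bh * rows, : bw * cols]
    # Average each block: every electrode summarizes one image region.
    coarse = trimmed.reshape(rows, bh, cols, bw).mean(axis=(1, 3))
    # Quantize brightness into the available stimulation levels.
    return np.round(coarse / 255 * (levels - 1)).astype(int)

# A random 240x320 frame standing in for one video capture.
frame = np.random.randint(0, 256, size=(240, 320))
pattern = frame_to_electrode_pattern(frame)
print(pattern.shape)  # (20, 20): one stimulation level per electrode
```

The coarse grid reflects a real constraint: an electrode array carries far fewer contact points than a camera has pixels, so each point has to summarize a whole patch of the scene rather than a single pixel. The same basic mapping applies whether the output is electrical pulses on the tongue or vibrating motors on the back.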

PET data from individual blind subjects after training with a tactile-vision sensory substitution device, with arrows indicating activity in the visual cortex [modified from 6].
The ability of sensory substitution devices to afford even limited functional vision to people who are blind marks a major step forward in assistive technology. These devices are no longer relegated to laboratories where researchers tweak the technology and training protocols; the Food and Drug Administration recently approved the BrainPort device after reviewing its effectiveness and safety for public use. In light of the BrainPort and other devices that might allow people to hear visual scenes [7] or feel sound, it’s clear that our brain’s interpretation of sensory stimuli is more complicated than meets the eye.
References
[1] Bach-y-Rita, P., Collins, C.C., Saunders, F., White, B. & Scadden, L. Nature 221, 963–964 (1969).
[2] Nau, A.C., Pintar, C., Arnoldussen, A. & Fisher, C. Am J Occup Ther. 69, 6901290010p1–8 (2015).
[3] Nau, A.C., Pintar, C., Fisher, C., Jeong, J.H. & Jeong, K. J Vis Exp. 11, e51205 (2014).
[4] Gizewski, E.R., Gasser, T., de Greiff, A., Boehm, A. & Forsting, M. NeuroImage 19, 968–975 (2003).
[5] Lambertz, N., Gizewski, E.R., de Greiff, A. & Forsting, M. Brain Res Cogn Brain Res. 25, 884–890 (2005).
[6] Ptito, M., Moesgaard, S.M., Gjedde, A. & Kupers, R. Brain 128, 606–614 (2005).
[7] Brown, D., Macpherson, T. & Ward, J. Perception 40, 1120–1135 (2011).
Title Image: https://www.flickr.com/photos/53558245@N02/