SAN FRANCISCO BAY -- Singularity Hub: Take a second and concentrate on your surroundings: the subtle flickering of your laptop screen, the faint whiff of lingering coffee, the muffled sounds of traffic, the warm touch of sunlight peeking through your window.
We owe our understanding of the world to our various senses. Yet what we naturally perceive is only a sliver of the physical world.
The eerie beauty of infrared is beyond our grasp, as are the air compression waves that bats use for navigation, or the electromagnetic fields that constantly course through our bodies.
“Your senses limit your reality,” said Stanford neuroscientist Dr. David Eagleman at the TED conference last year in Vancouver, British Columbia.
We are slaves to our senses, and when we lose one, we also lose our ability to perceive that fraction of the world. Take hearing, for example. Although cochlear implants can partially restore sound perception by standing in for the damaged inner ear, they’re pricey, surgically invasive and clunky. They also don’t work very well for congenitally deaf people who receive them later in life.
According to Eagleman, replacing faulty biological sensory hardware is too limited in scope.
What if, instead of trying to replace a lost sense, we could redirect it to another sense? What if, instead of listening, we could understand the auditory world by feeling it on our skin? And what if, using the same principles, we could add another channel to our sensory perception and broaden our reality?
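To make that last question concrete, here is a minimal sketch, in Python, of one way sound could be rendered on the skin: split incoming audio into frequency bands and drive one vibration motor per band. The motor count, sample rate, and band-to-motor mapping are illustrative assumptions, not the design of any actual device.

```python
import numpy as np

# Hypothetical parameters, chosen only for illustration.
NUM_MOTORS = 16        # imagined vibration motors arrayed across the skin
SAMPLE_RATE = 16_000   # assumed microphone sample rate, in Hz

def audio_to_motor_intensities(frame: np.ndarray) -> np.ndarray:
    """Map one frame of audio samples to per-motor vibration levels in [0, 1]."""
    spectrum = np.abs(np.fft.rfft(frame))         # magnitude spectrum
    bands = np.array_split(spectrum, NUM_MOTORS)  # one frequency band per motor
    energy = np.array([band.mean() for band in bands])
    peak = energy.max()
    return energy / peak if peak > 0 else energy  # normalize to 0..1

# Example: a pure 440 Hz tone concentrates its energy in one low band,
# so mostly one motor (or a small cluster) would buzz.
t = np.arange(1024) / SAMPLE_RATE
frame = np.sin(2 * np.pi * 440 * t)
print(np.round(audio_to_motor_intensities(frame), 2))
```

Collapsing the spectrum into a handful of coarse bands trades frequency detail for something the skin, with its much lower spatial resolution than the cochlea, could plausibly carry.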
Our “Mr. Potato Head” Brain

Eagleman’s ideas aren’t as crazy as they sound.
Our brain is locked in a sensory vacuum. It doesn’t perceive vision, smell, touch or sound directly; it understands only the language of electrochemical signals arriving through different “cables.” In essence, our peripheral sense organs are nothing but specialized sensors, translating various kinds of external input, such as photons and sound waves, into electricity that feeds into the brain.
“Your brain doesn’t know and it doesn’t care where it gets the data from,” says Eagleman. Your ear could be a microphone, your eye a digital camera, and the brain can still learn to interpret those signals. That’s why cochlear and retinal implants work.
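As a loose software analogy for that claim (not a model of neural computation), the consumer of a signal never has to know which transducer produced it; any device that emits the common format is interchangeable. The class names and signal format below are invented for illustration.

```python
import random
from typing import Protocol

class Sensor(Protocol):
    def read(self) -> list[float]:
        """Return a generic stream of activation values."""
        ...

class Microphone:
    def read(self) -> list[float]:
        # Stand-in for digitized sound-pressure samples.
        return [random.uniform(-1.0, 1.0) for _ in range(8)]

class Camera:
    def read(self) -> list[float]:
        # Stand-in for pixel brightness values.
        return [random.random() for _ in range(8)]

def brain(sensor: Sensor) -> float:
    # The same code handles either device: only the signal matters,
    # not the organ (or gadget) that generated it.
    signal = sensor.read()
    return sum(abs(x) for x in signal) / len(signal)

for device in (Microphone(), Camera()):
    print(type(device).__name__, round(brain(device), 3))
```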