In the book On Intelligence, Jeff Hawkins (founder of Palm Computing) argues that the roughly 30 billion neurons and 30 trillion synapses (junctions between neurons) in our brain act much like general-purpose computers: they can be trained to process kinds of information they were never originally wired for.
Most of sight doesn’t happen in the eyes. The eyes are just sensors that feed data to the brain, where the processing happens. As shown in the video below, Erik Weihenmayer has been able to regain some sight by using a BrainPort device that takes video and sends signals to his tongue. With it, he’s able to “see” some objects around him, read simple text, and even play tic-tac-toe with his daughter.
This makes me wonder how we could interact with the devices around us in more ways than just visually and audibly. For instance, Google Glass vibrates against the side of your head. Could it lead you around a city, giving you directions without using the display at all? I can imagine the timing of the pulses telling you how soon you have to turn, while the length of the pulses tells you whether to turn left or right.
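To make the idea concrete, here's a rough sketch of what that encoding could look like. Everything here is invented for illustration (it isn't a real Google Glass API): short buzzes mean left, long buzzes mean right, and the gap between buzzes shrinks as the turn gets closer.

```python
# Hypothetical pulse-encoding scheme for haptic turn-by-turn directions.
# The specific durations and the pattern format are assumptions, not a
# real device API.

def pulse_pattern(direction, meters_to_turn):
    """Return a list of (vibrate_ms, pause_ms) pulses.

    Left turn  -> short buzzes; right turn -> long buzzes.
    Closer turns -> shorter pauses, so the cue feels more urgent.
    """
    if direction not in ("left", "right"):
        raise ValueError("direction must be 'left' or 'right'")
    vibrate_ms = 100 if direction == "left" else 400
    # Pause scales with distance, clamped between 100 ms and 1000 ms.
    pause_ms = max(100, min(1000, int(meters_to_turn * 5)))
    return [(vibrate_ms, pause_ms)] * 3  # repeat the cue three times

# An imminent left turn feels like three quick, short buzzes:
print(pulse_pattern("left", 10))   # [(100, 100), (100, 100), (100, 100)]
```

The nice thing about a scheme like this is that it stays entirely out of your visual field, which is exactly the point.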