Free New eBook for Next 3 Days!
Science Fiction Short Story Sale Ends June 1st
Avar-Tek Event 3: Cohesion Lost
– “The writing is the best, it kept me going, and I wanted to find out what happened next. It’s a must read! …well written, and the characters are well developed.”
– “The story has mystery, conflict, and action, with the great plot twist that Justin Tyme does so well.”
The bionic eye is an artificial device that gives sight to the blind or enhanced sight to those who can already see — a cybernetic device. Currently, bionic eyes for the blind are experimental, with some success depending on when the patient was blinded and what caused the blindness. According to Wikipedia, there are at least 11 ongoing bionic eye projects:
- Argus Retinal Prosthesis
- Microsystem-based Visual Prosthesis (MIVIP)
- Implantable Miniature Telescope
- Tübingen MPDA Project Alpha IMS
- Harvard/MIT Retinal Implant
- Artificial Silicon Retina (ASR)
- Optoelectronic Retinal Prosthesis
- Dobelle Eye
- Intracortical Visual Prosthesis
- Virtual Retinal Display (VRD)
- Visual Cortical Implant
In the video below, Miikka Tertho, a blind man, sees images for the first time (filmed by the University of Tuebingen/Retina Implant AG). This is not Six Million Dollar Man stuff. (Remember that show?) This is not Lee Majors zooming in on the bad guy with a “boop-boop-boop-boop.” This is a blind man who can see because he has something like parts of a camcorder stuck in his eye.
Seeing versus Perceiving
Mike May (not the guy in the video above) was 3 years old when a chemical explosion blinded him. In 2000, when he was 46 years old, he regained partial vision after a corneal transplant and stem cell procedure. Although he can see, he has difficulty perceiving. “May still has no intuitive grasp of depth perception. As people walk away from him, he perceives them as literally shrinking in size.” He also has problems distinguishing male from female faces and recognizing emotional expressions on unfamiliar faces (Wikipedia, article: Mike May (skier)).
One theory is that the temporal visual cortex uses prior memory and experiences to make sense of shapes, colors, and forms. During our first five years of life outside the womb, our brains are building a library or database of images associated with their context. Over time, subtle cues are extracted from those images. The visual cortex compares the image we see now to that library of cues. But that part of the brain’s library is best stocked early, when our brains are most malleable. For Mike May, this part of the library was closed, but he was able to stock the cues in the sound and touch section. He developed very precise senses of hearing and touch.
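The “library of cues” idea can be loosely pictured as a nearest-neighbor lookup: the brain compares what it sees now against stored patterns and picks the closest match. The sketch below is only an analogy — the feature vectors and labels are made up for illustration, not a model of the visual cortex:

```python
import numpy as np

# Hypothetical "library" of visual cues: each row is a feature
# vector extracted from images seen during early childhood.
cue_library = np.array([
    [0.9, 0.1, 0.3],   # cue learned from, say, faces
    [0.2, 0.8, 0.5],   # cue learned from, say, landscapes
    [0.4, 0.4, 0.9],   # cue learned from, say, objects
])
labels = ["face", "landscape", "object"]

def perceive(image_features):
    """Compare the current image to stored cues and return the
    closest match -- a crude stand-in for visual recognition."""
    distances = np.linalg.norm(cue_library - image_features, axis=1)
    return labels[int(np.argmin(distances))]

print(perceive(np.array([0.85, 0.15, 0.25])))  # → face
```

Without an early-stocked library (the `cue_library` above), the lookup has nothing to match against — which is one way to think about why Mike May can see but struggles to perceive.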
The Problem with Adult Bionics
In the future when cybernetic replacements or enhancements are more common, it will also be necessary to fiddle with the mind to get the implants to work easily. A method might be found that will allow the patient to restock his or her visual database quickly. If not, then seeing will not equal perceiving.
Images and video courtesy of Retina Implant AG.
Earlier this week, The Associated Press reported that a paralyzed man remotely controlled a simple robot using only thoughts. (The images shown here are not from that experiment, but from one done three years ago by Honda. More on that later.)
The robot was a small, simple device that moved on wheels, was equipped with a camera, and had a laptop computer perched on top. The paralyzed man, Mark-Andre Duc, was 62 miles (100 km) away and controlled it using only a head cap while trying to raise his paralyzed fingers. The electroencephalogram (EEG) cap measured his brain signals, which were interpreted as command movements.
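How an EEG signal becomes a movement command can be sketched roughly: compute the power of the signal in a frequency band tied to motor imagery, then threshold it. This is a toy decoder under stated assumptions — the sampling rate, band, threshold, and command names are all invented for illustration, not details of the Swiss team’s system:

```python
import numpy as np

FS = 256  # assumed EEG sampling rate in Hz

def band_power(signal, low, high):
    """Average spectral power of `signal` between `low` and `high` Hz."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / FS)
    mask = (freqs >= low) & (freqs <= high)
    return spectrum[mask].mean()

def decode_command(eeg_window, threshold=1000.0):
    """Map a one-second EEG window to a robot command.

    Motor imagery (e.g. trying to raise the fingers) suppresses
    power in the mu band (8-12 Hz); this toy decoder issues
    "forward" when that suppression crosses a threshold.
    """
    mu = band_power(eeg_window, 8, 12)
    return "forward" if mu < threshold else "idle"

# Synthetic demo: a strong 10 Hz rhythm (rest) vs. broadband noise (imagery).
t = np.arange(FS) / FS
rest = 50 * np.sin(2 * np.pi * 10 * t)
imagery = np.random.default_rng(0).normal(0, 1, FS)
print(decode_command(rest), decode_command(imagery))
```

The fragility Mr. Duc describes falls out of this picture: anything that disturbs the band power — fatigue, pain, a lapse in attention — shifts the signal across the threshold and degrades control.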
Both the researchers and Mr. Duc admit it is not easy to use. Jose Millan, the team’s leader, said, “Sooner or later your attention will drop and this will degrade the signal.” Mr. Duc told The Associated Press through the video link on the laptop, “When I’m in pain it becomes more difficult.”
Using measurable thoughts to control an electronic device isn’t entirely new.
- Spring 2006: Honda Research Institute in Japan used feedback from an MRI (Magnetic Resonance Imaging) machine to remotely control a robotic hand. (Shown in video below.)
- Spring 2009: A team led by Javier Minguez at the University of Zaragoza in Spain worked on thought-controlled robotic wheelchairs.
- Spring 2009: Honda Research Institute in Japan demonstrated how their robot Asimo could lift an arm or a leg through signals from a user with EEG and NIRS (near-infrared spectroscopy sensors). (Shown in video below.)
- Fall 2009: Toy maker Mattel released a game based on a simplified version of this concept with mixed reviews.
- Fall 2010: A team led by Rajesh Rao from the Neural Systems Laboratory, University of Washington, worked not only on mind control of a robot but also on how to teach the robot simple tasks using the same mechanism (The Robot That Reads Your Mind to Train Itself).
- Spring 2011: A team led by C.T. Lin from California State University at Northridge created an EEG-cap-driven wheelchair that adapts to the operator’s unique brain patterns. For obstacle avoidance, the wheelchair is also equipped with a laser sensor and cameras. (See their video.)
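The obstacle avoidance mentioned for the Northridge wheelchair can be sketched as a simple rule over laser range readings. This is a toy policy of my own construction, not the team’s actual algorithm — the distances and action names are assumptions:

```python
def steer(ranges, stop_dist=0.5, slow_dist=1.5):
    """Choose a wheelchair action from one laser scan.

    `ranges` is a list of distances (meters) sweeping left to right.
    This toy policy stops if anything is too close, and otherwise
    veers toward the side with more open space.
    """
    nearest = min(ranges)
    if nearest < stop_dist:
        return "stop"
    if nearest < slow_dist:
        mid = len(ranges) // 2
        left_space = sum(ranges[:mid])
        right_space = sum(ranges[mid:])
        return "veer_left" if left_space > right_space else "veer_right"
    return "forward"

print(steer([2.0, 2.5, 3.0, 2.5, 2.0]))   # clear path → forward
print(steer([0.3, 2.0, 3.0, 2.0, 2.0]))   # obstacle very close → stop
print(steer([1.0, 1.2, 3.0, 3.5, 3.0]))   # obstacle on the left → veer_right
```

Layering a reflex like this under the EEG decoder means the noisy brain signal only has to express intent, while the sensors handle the last half-meter.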
Intricate manipulation, such as tying shoelaces, is not yet possible with EEG caps, since the signal is inherently too noisy. To get cleaner signals, we have to tap directly into the brain. Ouch. That painful subject is for another blog.