Brain-Machine Interface (BMI) – Part 1


Honda demonstrates its brain-machine interface (courtesy Honda Research Institute)

Earlier this week, The Associated Press reported that a paralyzed man remotely controlled a simple robot using only thoughts.  (The images shown here are not from that experiment, but from one done three years ago by Honda.  More on that later.)

The robot was a small, simple device that moved on wheels, was equipped with a camera, and had a laptop computer perched on top.  The paralyzed man, Mark-Andre Duc, was 62 miles (100 km) away and controlled it using only a head cap, attempting to raise his paralyzed fingers.  The electroencephalogram (EEG) cap measured his brain signals, which were interpreted as movement commands.
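That decoding step — turning raw EEG into movement commands — can be sketched in very simplified form. The snippet below is a hypothetical illustration, not the research team's actual pipeline: it thresholds spectral power in the mu band (8–12 Hz), a rhythm that typically weakens when a person imagines moving a limb. The sample rate, band edges, and threshold are all assumptions for the demo.

```python
import numpy as np

FS = 256           # assumed sample rate (Hz), typical for EEG caps
MU_BAND = (8, 12)  # mu rhythm band; it weakens during imagined movement

def mu_band_power(signal, fs=FS):
    """Average spectral power of a 1-D EEG epoch in the mu band."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= MU_BAND[0]) & (freqs <= MU_BAND[1])
    return spectrum[mask].mean()

def decode_command(signal, threshold, fs=FS):
    """Map an EEG epoch to a robot command: suppressed mu power -> 'forward'."""
    return "forward" if mu_band_power(signal, fs) < threshold else "idle"

# Synthetic demo: a resting epoch with a strong 10 Hz rhythm versus an
# "imagined movement" epoch in which that rhythm is suppressed.
t = np.arange(FS) / FS  # one second of data
rng = np.random.default_rng(0)
rest = 5.0 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1, FS)
imagery = 0.5 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1, FS)

# Threshold picked halfway between the two conditions, for illustration only.
threshold = (mu_band_power(rest) + mu_band_power(imagery)) / 2
print(decode_command(rest, threshold))     # idle
print(decode_command(imagery, threshold))  # forward
```

Real systems add bandpass filtering, multiple electrodes, and a trained classifier, but the basic idea — band power in, discrete command out — is the same.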

Both the researchers and Mr. Duc admit it is not easy to use.  Jose Millan, the team’s leader, said, “Sooner or later your attention will drop and this will degrade the signal.”  Mr. Duc told The Associated Press through the video link on the laptop, “When I’m in pain it becomes more difficult.”

Using measurable thoughts to control an electronic device isn’t entirely new.

  • Spring 2006: Honda Research Institute in Japan used feedback from an MRI (Magnetic Resonance Imaging) machine to remotely control a robotic hand. (Shown in video below.)
  • Spring 2009: A team led by Javier Minguez at the University of Zaragoza in Spain worked on thought-controlled robotic wheelchairs.
  • Spring 2009: Honda Research Institute in Japan demonstrated how their robot Asimo could lift an arm or a leg in response to a user’s signals from EEG and NIRS (near-infrared spectroscopy) sensors. (Shown in video below.)
  • Fall 2009: Toy maker Mattel released a game based on a simplified version of this concept with mixed reviews.
  • Fall 2010: A team led by Rajesh Rao at the Neural Systems Laboratory, University of Washington, worked not only on mind control of a robot, but also on teaching the robot simple tasks using the same mechanism (The Robot That Reads Your Mind to Train Itself).
  • Spring 2011: A team led by C.T. Lin from California State University at Northridge created an EEG-cap-driven wheelchair that adapts to the operator’s unique brain patterns. For obstacle avoidance, the wheelchair is also equipped with a laser sensor and cameras. (See their video.)

Intricate manipulation such as tying shoelaces is not yet possible with EEG caps, since the signal is inherently too noisy.  To get cleaner signals, we would have to tap directly into the brain.  Ouch.  That painful subject is for another blog.

