Blog Archives

Just Released: Cohesion Lost

Free New eBook for Next 3 Days!

Science Fiction Short Story Sale Ends June 1st

Avar-Tek Event 3: Cohesion Lost

eBook published May 2012
For Alexander Sevik, providing for his family is hard enough without losing his grip on reality. His dreams are real. One night, he lives the entire life of a deckhand on a Spanish galleon. The next night, it’s life as an ancient Roman senator. Next, he is a cyborg on a space cruiser. When he wakes, he sometimes forgets who he is. His hands tingle for no reason, and the strange man who is following him talks about aliens. When he discovers the key to his dreams, he uncovers a national threat, and he has to choose between his own sanity and saving lives.
________________
Reader’s Reviews:

– “A wonderful futuristic look at the world where technology has pushed the limits of imagination but human nature remains the same.”
~ by Kay in an Amazon review

– “The writing is the best; it kept me going, and I wanted to find out what happened next. It’s a must read! …well written, and the characters are well developed.”
~ by Health Nut in an Amazon review

– “The story has mystery, conflict, and action, with the great plot twist that Justin Tyme does so well.”
~ by Shadowfeld in an Amazon review
Visit Smashwords for the free story. If you download the story and like it, please drop by my website & sign up for more.


DecNef – Learning Through Brain Mapping “Matrix” Style

** SPOILER ALERT **  If you have not read Cohesion Lost but plan to, then do not read this.  It contains information that will spoil the plot.

Researchers: Adult early visual areas are sufficiently plastic to cause visual perceptual learning.
Credit: Nicolle Rager Fuller, National Science Foundation

In the novel Cohesion Lost, Tenbu and his classmates use plexus beds to relive the lives of historic figures. What a way to learn history. Historical dates are no longer abstract. When was the Battle of Waterloo?

If you use plexus learning, you don’t have to memorize dates and events; you live them. You don’t just read about historic icons; you meet them. The plexus bed contains microscopic connectors that link to your spinal cord and replace what you feel with what your character feels in a historical simulation. Say you just finished the simulation of the Napoleonic Wars and someone asks you: when was the Battle of Waterloo? That’s easy. It feels like last month. You were there.

Can this work for real? We may never get that far, but recent research at Boston University and ATR Computational Neuroscience Laboratories in Kyoto, Japan, shows promise in the area of imprint learning. “The technique is called ‘Decoded Neurofeedback’, or ‘DecNef’, and it involves using decoded fMRI to induce brain activity patterns that match a known state” (Scott Young, MedGadget: “Matrix” Style Learning Through the Visual Cortex).
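
For the curious, here is a minimal Python sketch of the closed loop that DecNef describes: train a decoder on fMRI activity patterns, then feed the decoded likelihood of the target pattern back to the subject as a reward signal. The data, decoder, and feedback rule are stand-ins of my own, not the researchers’ actual pipeline.

    # Toy DecNef loop (illustrative only): a decoder trained on fMRI
    # patterns scores live activity, and the score drives the feedback.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    # Offline: train a decoder on labeled patterns (200 scans x 500 voxels;
    # label 1 = the target brain state). Real data would replace this noise.
    X_train = rng.normal(size=(200, 500))
    y_train = rng.integers(0, 2, size=200)
    decoder = LogisticRegression(max_iter=1000).fit(X_train, y_train)

    # Online: each trial, the subject tries to make the feedback disc grow.
    # Disc size = decoded probability that the live pattern matches the
    # target state, which nudges the brain toward that state.
    for trial in range(5):
        live_pattern = rng.normal(size=(1, 500))  # stand-in for a live scan
        disc_size = decoder.predict_proba(live_pattern)[0, 1]
        print(f"trial {trial}: disc size = {disc_size:.2f}")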


** PLUG ALERT **

What will it be like for future students with DecNef and plexus beds?

Read Cohesion Lost and find out.


Press release from the National Science Foundation: Vision Scientists Demonstrate Innovative Learning Method

Journal article in Science: Perceptual Learning Incepted by Decoded fMRI Neurofeedback Without Stimulus Presentation

Cybernetics: Bionic Eyes

A close-up view of Retina Implant’s subretinal implant. It measures 3×3 mm and is 70 microns thick, and it contains 1,500 microelectrodes that replace the light receptors lost in macular degeneration.

The bionic eye is a cybernetic device: an artificial eye that gives sight to the blind or enhanced sight to those who can already see. Currently, bionic eyes for the blind are experimental, with varying success depending on when the patient was blinded and what caused the blindness. According to Wikipedia, there are at least 11 ongoing bionic eye projects.
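
To make that concrete, here is a toy Python sketch of the basic signal path in a subretinal chip like the one pictured above: brightness in, stimulation current out, one value per electrode. The grid size and current scale are my own illustrative guesses, not Retina Implant’s specifications.

    # Toy model of a subretinal chip: downsample incoming light onto the
    # electrode grid and convert brightness to stimulation current.
    import numpy as np

    GRID = (38, 40)        # ~1,500 cells, roughly matching the chip
    MAX_CURRENT = 2.0      # hypothetical per-electrode current limit (uA)

    def image_to_currents(image):
        """Average-pool a grayscale image (values 0..1) onto the electrode
        grid and scale each cell's brightness to a stimulation current."""
        h, w = image.shape
        gh, gw = GRID
        cropped = image[:h - h % gh, :w - w % gw]
        cells = cropped.reshape(gh, cropped.shape[0] // gh,
                                gw, cropped.shape[1] // gw).mean(axis=(1, 3))
        return cells * MAX_CURRENT

    scene = np.random.rand(380, 400)   # stand-in for light hitting the retina
    currents = image_to_currents(scene)
    print(currents.shape)              # (38, 40): one current per electrode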

In the video below, Miikka Terho, a blind man, sees images for the first time (filmed by the University of Tuebingen/Retina Implant AG). This is not Six Million Dollar Man stuff. (Remember that show?) This is not Lee Majors zooming in on the bad guy with a “boop-boop-boop-boop.” This is a blind man who can see because he has something like parts of a camcorder stuck in his eye.

Seeing versus Perceiving

Mike May (not the man in the video above) was 3 years old when a chemical explosion blinded him. In 2000, when he was 46, he regained partial vision after a corneal transplant and stem cell procedure. Although he can see, he has difficulty perceiving: “May still has no intuitive grasp of depth perception. As people walk away from him, he perceives them as literally shrinking in size.” He also has problems distinguishing male from female faces and recognizing emotional expressions on unfamiliar faces (Wikipedia: Mike May (skier)).

Why?

One theory is that the temporal visual cortex uses prior memories and experiences to make sense of shapes, colors, and forms. During our first five years of life outside the womb, our brains build a library, or database, of images associated with their context. Over time, subtle cues are extracted from those images. The visual cortex compares the image we see now against that library of cues. But that part of the brain’s library is best stocked early, while the brain is still highly plastic. For Mike May, that section of the library was closed, but he was able to stock cues in the sound and touch sections instead, developing very precise senses of hearing and touch.
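
As a loose analogy only (the brain is not literally a database), here is a toy Python sketch of perception-as-lookup: new input is matched against stored cues, and an unstocked library means seeing without recognizing. All names and numbers are invented for illustration.

    # Toy "library of cues": perception as nearest-neighbor lookup.
    import numpy as np

    library = {}   # label -> stored cue (a feature vector)

    def learn(label, cue):
        library[label] = np.asarray(cue, dtype=float)

    def perceive(features):
        if not library:
            return "unrecognized"   # seeing without perceiving
        features = np.asarray(features, dtype=float)
        return min(library, key=lambda k: np.linalg.norm(library[k] - features))

    # Stock the library early, then match new input against it.
    learn("face", [0.9, 0.2, 0.4])
    learn("tree", [0.1, 0.8, 0.6])
    print(perceive([0.85, 0.25, 0.45]))   # -> face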

The Problem with Adult Bionics

In the future, when cybernetic replacements or enhancements are more common, it will also be necessary to fiddle with the mind to get the implants working smoothly. A method might be found that lets the patient restock his or her visual database quickly. If not, then seeing will not equal perceiving.

Images and video courtesy of Retina Implant AG.

Brain-Machine Interface (BMI) – Part 1

Honda demonstrates its brain-machine interface (courtesy Honda Research Institute)

Earlier this week, The Associated Press reported that a paralyzed man remotely controlled a simple robot using only thoughts.  (The images shown here are not from that experiment, but from one done three years ago by Honda.  More on that later.)

The robot was a small, simple device that moved on wheels, was equipped with a camera, and had a laptop computer perched on top. The paralyzed man, Mark-Andre Duc, was 62 miles (100 km) away and controlled it using only a head cap, by trying to raise his paralyzed fingers. The electroencephalogram (EEG) cap measured his brain signals, which were interpreted as movement commands.
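
The AP story doesn’t include the team’s code, but a typical EEG command pipeline looks something like this minimal Python sketch. The frequency band, threshold, and single-channel setup are illustrative simplifications, not the Swiss team’s actual system.

    # Toy EEG decoder: band-pass the raw signal, measure mu/beta band
    # power, and map it to a robot command. Imagined movement suppresses
    # mu/beta power over motor cortex, so low power means "go".
    import numpy as np
    from scipy.signal import butter, filtfilt

    FS = 256  # sampling rate, Hz

    def band_power(eeg, lo=8.0, hi=30.0):
        """Signal power in the 8-30 Hz band of one EEG channel."""
        b, a = butter(4, [lo / (FS / 2), hi / (FS / 2)], btype="band")
        filtered = filtfilt(b, a, eeg)
        return float(np.mean(filtered ** 2))

    def decode_command(eeg, threshold):
        return "FORWARD" if band_power(eeg) < threshold else "STOP"

    # One second of fake EEG; a real system would calibrate the threshold
    # per user from labeled practice trials.
    epoch = np.random.randn(FS)
    print(decode_command(epoch, threshold=1.0))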

Both the researchers and Mr. Duc admit it is not easy to use. Jose Millan, the team’s leader, said, “Sooner or later your attention will drop and this will degrade the signal.” Mr. Duc told The Associated Press through the video link on the laptop, “When I’m in pain it becomes more difficult.”

Using measurable thoughts to control an electronic device isn’t entirely new.

  • Spring 2006: Honda Research Institute in Japan used feedback from an MRI (Magnetic Resonance Imaging) machine to remotely control a robotic hand. (Shown in video below.)
  • Spring 2009: A team led by Javier Minguez at the University of Zaragoza in Spain worked on thought-controlled robotic wheelchairs.
  • Spring 2009: Honda Research Institute in Japan demonstrated how their robot Asimo could lift an arm or a leg through signals from a user with EEG and NIRS (near-infrared spectroscopy sensors). (Shown in video below.)
  • Fall 2009: Toy maker Mattel released a game based on a simplified version of this concept with mixed reviews.
  • Fall 2010: A team led by Rajesh Rao at the Neural Systems Laboratory, University of Washington, worked not only on mind control of a robot but also on teaching the robot simple tasks through the same mechanism (The Robot That Reads Your Mind to Train Itself).
  • Spring 2011: A team led by C.T. Lin from California State University at Northridge created an EEG-cap-driven wheelchair that adapts to the operator’s unique brain patterns. For obstacle avoidance, the wheelchair is also equipped with a laser sensor and cameras. (See their video.)

Intricate manipulation, such as tying shoelaces, is not yet possible with EEG caps, since the signal is inherently too noisy. To get cleaner signals, we have to tap directly into the brain. Ouch. That painful subject is for another blog.
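
As a parting illustration of the noise problem, here is a toy Python simulation: a weak rhythmic “brain” signal is invisible in a single trial but emerges after averaging 100 trials. A robot driven in real time has to act on single trials, so it never gets that averaging.

    # Toy demo: single-trial EEG vs. a 100-trial average.
    import numpy as np

    rng = np.random.default_rng(1)
    t = np.linspace(0, 1, 256)
    signal = 0.5 * np.sin(2 * np.pi * 10 * t)   # weak 10 Hz component

    def similarity(trace):
        """Correlation between a recorded trace and the true signal."""
        return float(np.corrcoef(trace, signal)[0, 1])

    single = signal + rng.normal(scale=2.0, size=t.size)
    averaged = signal + rng.normal(scale=2.0, size=(100, t.size)).mean(axis=0)

    print(f"single trial:      r = {similarity(single):.2f}")   # near 0
    print(f"100-trial average: r = {similarity(averaged):.2f}") # near 1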