Two Brains Act As One Super Pilot with Machine Learning

Two brains are better than one! At least they are when navigating a virtual spacecraft in a simulator.

More specifically, two people’s combined thoughts steered the craft far more accurately than either person’s thoughts alone.

Paul Marks at New Scientist reports on how a team at the University of Essex in the UK researching brain-computer interfaces (BCI) arrived at this conclusion. The team will present its findings in March at the Intelligent User Interfaces conference.

Their experiment suggests “collaborative BCI” could flourish in the future in fields like robotics and telepresence.

EEG electrodes were hooked up to the scalps of both navigators. As each person thought about a predefined concept like “left” or “right,” machine learning software identified patterns in their thinking and applied them in real time.

For the participants, the challenge was to steer the craft to the exact center of a planet using a repertoire of eight directional thoughts. Machine learning software merged their separate thoughts into continuous actions that directed the craft.

The researchers found simulation flights using collaborative BCI were 90% on target versus 67% for solo pilots.

Even when sudden changes in the planet’s position were introduced, having additional brainpower cut human reaction time in half.

Also, since EEG signals are often fraught with noise, combining signals from two brains helped maintain a more usable signal-to-noise ratio. It also compensated for lapses of attention when one person’s mind momentarily strayed.
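To get an intuition for why two brains help, here is a minimal sketch (not the researchers’ actual pipeline) of the statistical effect at work: averaging two independently noisy readings of the same underlying “intent” signal roughly halves the noise power. All names and parameters below are illustrative assumptions, not details from the study.

```python
import math
import random

random.seed(42)  # fixed seed so the illustration is reproducible

# Hypothetical shared intent signal (e.g., "steer left" waxing and waning).
n = 1000
true_signal = [math.sin(2 * math.pi * t / 100) for t in range(n)]

def noisy_reading(signal, sigma=1.0):
    """Simulate one person's EEG-like reading: signal plus independent noise."""
    return [s + random.gauss(0, sigma) for s in signal]

pilot_a = noisy_reading(true_signal)
pilot_b = noisy_reading(true_signal)

# "Collaborative" estimate: average the two pilots' readings sample by sample.
merged = [(a + b) / 2 for a, b in zip(pilot_a, pilot_b)]

def noise_power(estimate):
    """Mean squared error against the true signal."""
    return sum((e, s) and (e - s) ** 2 for e, s in zip(estimate, true_signal)) / n

solo_noise = noise_power(pilot_a)
merged_noise = noise_power(merged)
print(f"solo noise power: {solo_noise:.2f}")
print(f"merged noise power: {merged_noise:.2f}")
```

Because the two noise sources are independent, the merged estimate’s noise power comes out close to half of a solo reading’s, which is the same reason pooling two EEG streams yields a cleaner control signal.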

While remotely controlling actual spacecraft using collaborative BCI is still a long way off, more modest uses are altogether feasible today.

For example, it could enable people with disabilities to steer a wheelchair with their thoughts.

So, how do you see collaborative BCI being used in your world?