Prospects for Brain-Computer Interfacing

January 25, 2014

A group of undergraduates at Northeastern University demonstrated in June that they could steer a robot by thought. The subject in the experiment watched a computer screen and selected commands with his gaze; the visual stimulation evoked electrical activity in the brain’s visual cortex ranging from 4 to 100 hertz. Those signals were then translated into movement commands for a small robot similar to the Roomba vacuum cleaner.

Electrical engineering professor Deniz Erdogmus, who oversaw the project at Northeastern, says that because the connection between the user and the robot is Internet-based (you can track the robot over Skype), an operator could control it from a considerable distance.

“We could take the robot to Tahiti and the operator can take a webcam tour,” says Erdogmus. “We are looking for volunteers to take the robot to Bora Bora.”

The demonstration was the latest in a string of breakthroughs over the last decade, showing the growing viability of brain-computer interface, or BCI, technologies. Cybernetic research will advance far more rapidly in the next few years, experts contend.

The Present and Future of Brain-Computer Interface Technology

Neural interface technology goes back half a century (and the larger field of cybernetics dates to World War II), but advancement has proceeded unevenly. The primary obstacle was, and remains, system compatibility: the delicate and complicated web of nervous tissue that is the brain doesn’t communicate well with wires and electronics.

“If you put an array of sensors into a brain, there’s a tissue reaction, namely scarring. The nervous tissue can no longer send a signal when there’s scar tissue,” says Klaus-Robert Müller, director of the machine learning group at Technische Universität Berlin.

Previous studies have shown that joining mammalian brain matter directly to electric circuitry can burn or melt the surrounding tissue. Over the last two decades, however, advances in computation have let researchers partially sidestep this problem by relying more on devices that collect brain signals without being surgically implanted.

Electroencephalography (EEG), the technique the Northeastern team used, is among the favored of these approaches. EEG uses a sensor array affixed externally to the subject’s head, like a swimming cap. Because an EEG signal is weaker than the signal from a surgically implanted sensor, more guesswork is required to deduce what the brain is trying to communicate, and that guesswork is done algorithmically: noninvasive BCI leans far more heavily on mathematical problem solving.
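To make that algorithmic guesswork concrete, here is a minimal, hypothetical sketch of one common noninvasive approach: comparing spectral power at a few candidate stimulus frequencies to decide which command the user is attending to. The sampling rate, command frequencies, and synthetic data below are illustrative assumptions, not details of the Northeastern system.

```python
# Illustrative sketch only: pick the most likely command from one EEG
# channel by comparing spectral power at candidate stimulus frequencies.
# All parameters here are assumptions for the demo, not measured values.
import numpy as np
from scipy.signal import welch

FS = 250.0  # assumed sampling rate in Hz
COMMAND_FREQS = {"forward": 8.0, "left": 10.0, "right": 12.0}  # hypothetical

def band_power(signal, fs, center, half_width=0.5):
    """Average Welch PSD in a narrow band around `center` Hz."""
    freqs, psd = welch(signal, fs=fs, nperseg=int(fs * 2))
    mask = (freqs >= center - half_width) & (freqs <= center + half_width)
    return psd[mask].mean()

def decode_command(eeg_window):
    """Return the command whose stimulus frequency carries the most power."""
    scores = {cmd: band_power(eeg_window, FS, f)
              for cmd, f in COMMAND_FREQS.items()}
    return max(scores, key=scores.get)

# Four seconds of synthetic data with a dominant 10 Hz component:
t = np.arange(0, 4.0, 1.0 / FS)
eeg = np.sin(2 * np.pi * 10.0 * t) + 0.5 * np.random.randn(t.size)
print(decode_command(eeg))  # expected: "left"
```

In a real system the decoded label would be forwarded over the network as a motion command; the weaker the signal, the longer the analysis window needed before a confident decision can be made.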

Erdogmus says that more funding agencies are seeing the potential of BCI, and this is having a positive effect across the field. “Technological and algorithmic advances allowed more groups to work on this problem for [less] equipment-wise,” he says.

Müller agrees that shifting more of the burden to number crunchers (helped by the increase in computing power in recent years) has made a big difference. “We have put all the learning on the machine side. The computer learns to interpret your brain waves,” he says. A few years ago, subjects needed as much as 300 hours of training to control their brain signals before those waves were usable in a BCI. Now, says Müller, about five minutes of training achieves the same effect.
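As a rough illustration of what “learning on the machine side” can look like, the sketch below fits a standard linear classifier to a few minutes’ worth of labeled calibration trials. The feature layout and data are synthetic placeholders, not anything from Müller’s lab.

```python
# Sketch of putting the learning on the machine side: fit a linear
# classifier to labeled EEG feature vectors from a short calibration
# session. Features and labels here are synthetic placeholders.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Pretend each calibration trial yields 4 band-power features
# (e.g., over left and right motor cortex -- an assumption).
n_trials = 80
X_left = rng.normal(loc=[1.0, 0.4, 0.8, 0.3], scale=0.2, size=(n_trials, 4))
X_right = rng.normal(loc=[0.4, 1.0, 0.3, 0.8], scale=0.2, size=(n_trials, 4))
X = np.vstack([X_left, X_right])
y = np.array(["left"] * n_trials + ["right"] * n_trials)

clf = LinearDiscriminantAnalysis().fit(X, y)  # minutes of data, not hours

# Classify a new, unseen trial:
new_trial = rng.normal(loc=[0.95, 0.45, 0.75, 0.35], scale=0.2, size=(1, 4))
print(clf.predict(new_trial))  # expected: ['left']
```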

Brain-based control of conventional keyboards, allowing individuals to type without physically touching the keys, has been demonstrated at the universities of Wisconsin and Michigan. In the near future, experts say, brain e-mailing and tweeting will become far more common (though these interfaces remain extremely slow). BCI will also show up in some surprising places.

Other Applications for Brain-Computer Interfaces

In the near term, video-game makers could use BCI to develop gaming systems capable of reading and responding to a player’s emotional state. Similar research could lead to new therapies for various neurotic disorders, enabling sufferers to see, and potentially moderate, their own brain patterns to reduce stress. Müller reports that a company called Pico has designed an iPhone application that lets users view their own thought patterns on the phone’s screen. (He says the app is not yet commercially available, as it requires a surgical implant to operate.) Automobile manufacturers might use BCI to improve navigation systems.

“Say you’re a carmaker; you are designing a new driver-assistance system,” says Müller. “Normally if you were testing this system, you would have people come in, try the car, and you would survey them on their experience. But what if you wanted a highly accurate quantitative measure to see if cognitive workload was lower using one gadget over another? Or you wanted to see how people reacted emotionally to different designs? These things can be measured non-intrusively and quantitatively.”
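One simple example of such a quantitative measure is the ratio of frontal theta power to parietal alpha power, a generic heuristic from EEG workload studies rather than a method Müller describes. The sketch below assumes hypothetical channel data and a 250 Hz sampling rate.

```python
# Hypothetical workload metric: frontal theta (4-8 Hz) power divided by
# parietal alpha (8-13 Hz) power. Channels, bands, and data are all
# illustrative assumptions, not details from the interviews.
import numpy as np
from scipy.signal import welch

FS = 250.0  # assumed sampling rate in Hz

def band_power(signal, fs, low, high):
    """Mean Welch PSD between `low` and `high` Hz."""
    freqs, psd = welch(signal, fs=fs, nperseg=int(fs * 2))
    return psd[(freqs >= low) & (freqs <= high)].mean()

def workload_index(frontal, parietal, fs=FS):
    """Higher theta/alpha ratio is commonly read as higher mental workload."""
    return band_power(frontal, fs, 4.0, 8.0) / band_power(parietal, fs, 8.0, 13.0)

# Synthetic 10-second recordings standing in for two electrode channels:
n = int(10 * FS)
frontal, parietal = np.random.randn(n), np.random.randn(n)
print(f"workload index: {workload_index(frontal, parietal):.2f}")
```

A carmaker could compare this index across two dashboard designs for the same driver, which is one way the survey-free testing Müller describes might be quantified.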

Erdogmus sees brain-controlled prostheses and robots going mainstream within a few decades. There have been a number of startling demonstrations on this front beyond the work at Northeastern. In 2008, a University of Pittsburgh team led by Andrew Schwartz taught a monkey to feed itself using a robot arm that the monkey controlled via a brain implant. (A link to the video is available on THE FUTURIST’s Web site.)

Researchers caution that they need much more information about the brain, particularly its feedback mechanisms and how it transitions between different states, before science can fulfill the more ambitious cybernetic visions of science fiction. Acquiring this information will be the most important application of BCI in the years ahead. -Patrick Tucker

Sources: Personal interviews: Deniz Erdogmus (by e-mail), Northeastern University, http://www.northeastern.edu; Klaus-Robert Müller, Technische Universität Berlin. Suggested further reading: Toward Brain-Computer Interfacing, edited by Guido Dornhege et al., MIT Press, 2007.

Sidebar

Students at Northeastern University have successfully steered a robot via brain signals.

In 2008, a University of Pittsburgh team led by Andrew Schwartz taught a monkey to feed itself using a robot arm that the monkey controlled via implant. Researchers see brain-computer interface technology making considerable progress in the years ahead, though progress in brain control of robotic prostheses will likely lag brain-to-PC communication, perhaps by a decade or more.


Originally published in THE FUTURIST, September-October 2010
