Mind-Controlled Technology Extends Beyond Exoskeletons

A wearable robot controlled by brain waves will take center stage at the World Cup this week, but it’s not the only mind-controlled tech out there

(The mind-controlled exoskeleton developed by Miguel Nicolelis and his colleagues will allow a paralyzed teenager to make the ceremonial first kick of the 2014 World Cup. Photo: Danilo Borges/World Cup Portal)

Today, in São Paulo, Brazil, a paralyzed teenager will don an exoskeleton controlled by his own brain waves, walk onto the pitch at Itaquera stadium and kick a soccer ball. It’ll be quite an amazing feat if all goes according to plan.

The Walk Again Project is soccer-meets-neuroscience on a global stage. One of the project’s neuroscientists and a native Brazilian, Miguel Nicolelis has even compared the endeavor to putting a man on the Moon. However, critics—both scientific and social—hounded the research team as it scrambled to get the exoskeleton through clinical testing in time for the deadline. While some scientists question whether the technology behind the exoskeleton is ready for prime time, others worry that the project is more publicity stunt than scientific demonstration.

Whatever one might think of the demonstration this week, mind-controlled technology is here to stay. In fact, there’s a wide range of technology you can control with your mind. Here’s an overview:

Exoskeletons

(A patient wears an EEG cap to test the Walk Again exoskeleton in Brazil. Photo: Danilo Borges/World Cup Portal.)

For decades, engineers and neuroscientists have been teaming up to build better, smarter wearable robots for paraplegics. Here’s how it works: A cap lined with electroencephalography (EEG) sensors picks up nerve signals across the brain through the skull. EEG can’t target specific areas of the brain, but a finely tuned computer algorithm can zero in on the brain signal that’s saying “walk forward” or “turn left.” These signals are translated into electronic commands that trigger movement in the exoskeleton, and so move the limbs of the person wearing it.
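In rough terms, the decoding step can be pictured with a short Python sketch like the one below: window the EEG, extract a simple feature for each electrode, match the pattern against examples recorded during training, and pass the winning command to the robot. The command names, frequency band and nearest-pattern matching here are illustrative assumptions, not the Walk Again project’s actual software.

```python
# A simplified, hypothetical sketch of an EEG-to-command loop. Real systems use
# far more sophisticated signal processing and classifiers.
import numpy as np

COMMANDS = ["walk_forward", "turn_left", "turn_right", "stop"]

def band_power(eeg_window, fs=256.0, band=(8.0, 30.0)):
    """Average spectral power per electrode in one frequency band."""
    freqs = np.fft.rfftfreq(eeg_window.shape[1], d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(eeg_window, axis=1)) ** 2
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return spectrum[:, mask].mean(axis=1)  # one feature per electrode

def classify(features, centroids):
    """Pick the command whose stored (pre-trained) pattern is closest to this window."""
    distances = [np.linalg.norm(features - centroids[c]) for c in COMMANDS]
    return COMMANDS[int(np.argmin(distances))]

def control_loop(eeg_stream, centroids, exoskeleton):
    """Read EEG windows (electrodes x samples), decode an intent, send it on."""
    for window in eeg_stream:
        command = classify(band_power(np.asarray(window)), centroids)
        exoskeleton.send(command)  # the exoskeleton turns this into joint motion
```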

Currently, commercially available exoskeletons carry large battery packs and only allow the user to move slowly. Users control the robot with a joystick or by shifting their weight—similar to a Segway. Some models are lined with sensors that stimulate leg muscles, as well.

The particular exoskeleton that will be unveiled at the World Cup still looks large and clunky, but it allows the user to control the machine using thought alone. The project is a collaboration among scientists across the globe: researchers in Europe designed and built the exoskeleton itself, Nicolelis’s lab at Duke University developed the computer algorithms that translate the user’s brain signals into movements, and a facility in São Paulo oversaw clinical testing in local patients.

This and other EEG-controlled exoskeletons serve as alternatives to using electrode implants in specific regions of the brain responsible for motor control. EEG systems and neural implants both constitute brain-computer interface technologies, and while implants could target the precise neurons that fire when a person thinks about taking a step, they require surgery.

Both interfaces could hypothetically work with an exoskeleton, and Nicolelis originally planned to use an implant in the World Cup demonstration. But the ease and relative safety of EEG have made it popular among exoskeleton researchers in recent years.

Bionic Limbs

(The DEKA Arm is capable of handling objects as delicate as grapes and eggs as well as manipulating power tools, such as a hand drill. Photo: DARPA)

In 1999, Nicolelis and his colleagues built a working bionic arm for rats, which the animals controlled via electrodes implanted in the area of the brain responsible for voluntary muscle movement.

Since then, bionic limbs have come a long way. Implants are more invasive than EEG, but they can target specific neural regions rather than scanning the whole brain. In research trials, bionic arms have allowed paralyzed patients to grasp and carry objects like a piece of chocolate or a cup of coffee. These arms are large, cumbersome and primarily aimed at those with a high degree of paralysis.

For amputees, myoelectric prosthetics can translate signals from muscles in the remaining part of the limb into movement of the prosthetic arm. Electromyogram (EMG) sensors pick up on muscle activity in the residual limb and tell the prosthetic how to move. The FDA approved its first such mind-controlled prosthetic limb, the DEKA Arm, earlier this year, and similar devices are in clinical testing.
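As a rough illustration of that myoelectric idea, here is a minimal Python sketch that compares activity at two hypothetical muscle sites and decides whether the hand should open, close or hold still. The thresholds and channel roles are invented for the example and are not taken from the DEKA Arm or any other device.

```python
def rectified_mean(samples):
    """Crude EMG amplitude estimate: mean absolute value over a short window."""
    return sum(abs(s) for s in samples) / len(samples)

def decide_grip(flexor_window, extensor_window, threshold=0.2):
    """Compare two hypothetical muscle sites and choose a hand action."""
    flexor = rectified_mean(flexor_window)
    extensor = rectified_mean(extensor_window)
    if flexor > threshold and flexor > extensor:
        return "close_hand"   # stronger flexion signal -> close the hand
    if extensor > threshold and extensor > flexor:
        return "open_hand"    # stronger extension signal -> open the hand
    return "hold"             # no clear signal -> keep the current grip
```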

Other mind-controlled prosthetics use special electrodes inserted into two arm nerves (the ulnar and median) to give the wearer a basic sense of touch. Targeted muscle re-innervation, a surgical procedure that rewires some of the neural circuitry at the amputation site, can also improve an individual’s control and manipulation of a bionic prosthetic. 

Flying Robots

(Researchers at the University of Minnesota test their brain-controlled quadcopter. Photo: University of Minnesota)

A team at the University of Minnesota is experimenting with a mind-controlled helicopter that also functions via an EEG cap. The 64 electrodes that line the cap pick up signals from the brain and transmit those signals via Wi-Fi to a quadcopter—a small drone with four propellers (sort of like a remote-controlled helicopter but with a smarter computer brain).

While most EEG caps used in neuroscience and prosthetics research carry as many as 100 electrodes, the electrodes in an EEG headset—a device that looks like an old-school Walkman—number only in the teens. In 2012, researchers at Zhejiang University in China used such a headset to come up with a quadcopter flying system. The headset wearer thinks “left,” the headset relays the neurological message to a laptop, and the laptop tells the flying robot to go left.
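The relay step in both projects can be pictured with a short Python sketch: the laptop takes each decoded intent (“left,” “right” and so on) and forwards a motion command to the drone over the network. The message format, motion values and drone address below are hypothetical stand-ins, not either team’s actual protocol.

```python
import json
import socket

# Hypothetical mapping from a decoded intent to a motion command for the drone.
INTENT_TO_MOTION = {
    "left":    {"roll": -0.3, "pitch": 0.0},
    "right":   {"roll":  0.3, "pitch": 0.0},
    "forward": {"roll":  0.0, "pitch": 0.3},
    "hover":   {"roll":  0.0, "pitch": 0.0},
}

def relay(decoded_intents, drone_address=("192.168.1.50", 8889)):
    """Forward each decoded thought ('left', 'right', ...) to the drone over UDP."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for intent in decoded_intents:
        motion = INTENT_TO_MOTION.get(intent, INTENT_TO_MOTION["hover"])
        sock.sendto(json.dumps(motion).encode("utf-8"), drone_address)
```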

Ultimately, both of these brain-computer interface projects have loftier goals than flying robots. The aim is to give the mobility-impaired a new avenue for interacting with the world around them through electronics. Someday such a system could, for example, allow a wheelchair-bound individual to easily use appliances, flip lights on or off, or use a computer.

Wheelchairs

(Michele Tavella, a graduate student at the Federal Institute of Technology in Lausanne, Switzerland, drives an electronic wheelchair using an EEG cap. Photo: EPFL)

Given that researchers are interested in developing systems that increase paraplegics’ independence, it follows that mind-controlled wheelchairs are in the works.

In 2009, researchers at Toyota and RIKEN (short for Rikagaku Kenkyūsho) labs built an electric wheelchair guided by an EEG cap. Perhaps its most impressive feature was speed: the system took just 125 milliseconds to turn a thought into a directional command. The researchers claimed it worked 95 percent of the time.

Then in 2010, a team at Switzerland’s Federal Institute of Technology in Lausanne came out with a prototype wheelchair that could both read a person’s thought commands and factor in information about obstacles in its path, gathered by two webcams mounted on the chair. The approach is called “shared control”: users don’t have to constantly tell the chair what to do.
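That shared-control logic can be sketched in a few lines of Python: follow the rider’s decoded command unless the cameras report an obstacle too close in that direction. The obstacle-map format and distance thresholds are assumptions made for illustration, not the Lausanne team’s implementation.

```python
def shared_control(user_command, obstacle_distances, safe_distance=0.5):
    """
    user_command: 'forward', 'left' or 'right', decoded from the EEG cap.
    obstacle_distances: direction -> distance in meters, reported by the webcams.
    Returns the command the wheelchair actually executes.
    """
    distance = obstacle_distances.get(user_command, float("inf"))
    if distance < safe_distance:
        return "stop"                   # override an unsafe command
    if distance < 2 * safe_distance:
        return user_command + "_slow"   # proceed cautiously near an obstacle
    return user_command                 # clear path: do what the rider asked
```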

Thanks to EEG headsets, there’s also a DIY mind-controlled wheelchair option, in which users buy a headset and rig the wheelchair, software and headset to work together. No matter the approach, users need to train their brains to give clear commands, and for now those commands are pretty limited (turn left, turn right, go forward and so on).

Other groups are moving beyond wheelchairs and putting EEG tech in cars, too, but don’t expect to see mind-controlled vehicles on the highway anytime soon. 

Games

(NeuroSky CEO Stanley Yang demonstrates the Mind Flex game, a Mattel product that uses NeuroSky's brain control technology. Photo: © Lea Suzuki/San Francisco Chronicle/Corbis)

EEG headsets have moved beyond the realm of the research lab and into commercial venues. In 2007, the San Jose-based company NeuroSky came out with an EEG headset and a companion game called The Adventures of NeuroBoy, in which the player could use “telekinetic powers” to move and manipulate objects.

Since then, a slew of EEG-incorporated games have entered the market, including the Star Wars-inspired Force Trainer, in which users can manipulate a ball in a tube using “The Force.”  Impressed, we are.

In 2009, Emotiv, a neuroimaging company based in San Francisco, California, came out with an EEG headset called EPOC that employs just 14 electrode sensors. Emotiv also came up with software packages designed to interpret different types of brain signals—one interprets facial expressions, another emotional states—to enhance the gaming experience.

Because headset wearers need to train and focus their minds on a specific task to produce the desired results in a game, some neuroscientists think that EEG gaming technology could have future applications in education or in therapy for those with attention disorders like ADHD.

Music

(The MiND ensemble performs at the University of Michigan in 2012. Photo: Robert Alexander/MiND Ensemble)

Given that scientists can already transform brain waves into music, it’s no surprise that some musical projects do away with the pen and paper, or the mouse clicking, involved in traditional composition software.

At the University of Michigan, a performance group called the MiND ensemble (short for Music in Neural Dimensions) uses Emotiv’s EPOC headset to create music—literally—with their minds. Made up of researchers, musicians and artists, the MiND ensemble has developed software that translates thought into sound—in this case, a thought equates to a musical note. With it, one can compose music or simply play a virtual instrument just by thinking.
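A toy Python sketch of the thought-equals-note idea might simply map a single normalized headset reading onto a musical scale, as below. The zero-to-one “focus” value and the pentatonic scale are assumptions made for illustration, not the ensemble’s actual software.

```python
# MIDI note numbers for a C-major pentatonic scale (an arbitrary choice here).
PENTATONIC = [60, 62, 64, 67, 69, 72, 74, 76]

def focus_to_note(focus_level):
    """Map a normalized headset reading (0.0 to 1.0) onto a scale degree."""
    focus_level = min(max(focus_level, 0.0), 1.0)    # clamp out-of-range readings
    index = int(focus_level * (len(PENTATONIC) - 1))
    return PENTATONIC[index]
```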

Another music-and-art project, called NeuroDisco, uses the EPOC headset to generate both rhythms and lights. The headset picks up on the wearer’s emotional and cognitive signals and transmits them to a laptop, where specially designed software maps the brain signals to rhythms and light-fixture patterns. The wearer has to train their brain to give different cognitive commands to produce different rhythms.

For the music listener, there’s always Neuro Turntable, a music player that can be turned on and off using an EEG headset. It comes in both app and high-tech record player form.

Furry Ears And Tails

(Hanako Miyake demonstrates Neurowear's 'Necomimi' at a Tokyo event in 2011. The table to the left shows her brain waves. Photo: © KIM KYUNG-HOON/Reuters/Corbis)

Neuroscience-inspired fashion is perhaps the oddest category of mind-controlled products. For example, a device called Shippo uses a NeuroSky EEG headset to control a furry, robotic tail. And Necomimi Cat Ears, developed by the Japanese tech company Neurowear, use EEG to move fuzzy ears on a headband according to the user’s mood—relaxed, interested and so on.

The ears, incidentally, could serve a practical purpose: screening phone calls. At the AT&T Hackathon last year, neuroscientist Ruggero Scorcioni showed that the headset can also connect to a smartphone. Scorcioni developed an app called Good Times, which blocks or accepts calls based on the wearer’s emotional state, as reflected by the ears.
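In outline, that screening logic is the kind of thing a few lines of Python can express, as in the toy sketch below; the state labels and threshold are invented for illustration and are not the app’s actual rules.

```python
def should_accept_call(mental_state, engagement_level, busy_threshold=0.7):
    """Return True if the wearer seems receptive enough to take the call."""
    if mental_state == "relaxed":
        return True                           # relaxed state: let the call through
    return engagement_level < busy_threshold  # deeply focused: send to voicemail
```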

Think of it: mind-controlled technology could not only take Halloween costumes to the next level but also allow you to control all aspects of your aesthetic with just a thought.
