LAS VEGAS — It’s not the self-driving cars, flying cars or even the dish-washing robots that stick out as the most transformative innovation at this year’s Consumer Electronics Show: It’s the wearable gadgets that can read your mind.
There’s a growing category of companies focused on the “brain-computer interface”: devices that record brain signals from sensors on the scalp (or even electrodes implanted in the brain) and translate them into digital signals. The industry is expected to reach $1.5 billion this year, with the technology used for everything from education and prosthetics to gaming and smart home control.
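Under the hood, every product in this category runs some version of the same loop: sample voltages from the sensors, decode an intent from the signal, emit a command a computer can act on. Here is a minimal sketch of that loop; the sensor driver, decoder and command sink are all hypothetical stand-ins, not any vendor’s real API.

```python
def bci_loop(sensor, decode, send_command):
    """The basic brain-computer-interface loop: record brain signals,
    translate them into an intent, and emit a digital command.
    `sensor`, `decode` and `send_command` are hypothetical stand-ins."""
    while True:
        window = sensor.read(n_samples=256)  # ~1 s of raw scalp voltages
        intent = decode(window)              # e.g. "focused", "relaxed" or None
        if intent is not None:
            send_command(intent)             # drive a lamp, a TV, a game...
```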
This isn’t science fiction. At CES this week, I tried a couple of wearables that track brain activity, and I was surprised to find they really work. NextMind has a headset that measures activity in your visual cortex with a sensor on the back of your head. It translates where the user’s eyes focus into digital commands.
“You don’t see with your eyes, your eyes are just a medium,” NextMind CEO Sid Kouider said. “Your vision is in your brain, and we analyze your vision in your brain and we can know what you want to act upon and then we can modify that to basically create a command.”
Kouider said that this is the first time there’s been a brain-computer interface outside the lab, and the first time you can theoretically control any device by focusing your thoughts on it.
Wearing a NextMind headset, I could change the color of a lamp (red, blue and green) by focusing on boxes lit up in those colors. The headset also replaced a remote control. Staring at a TV screen, I could activate a menu by focusing on a triangle in a corner of the screen. From there, I could change the channel, mute the sound or pause the video, just by focusing on a triangle next to each command.
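In code, the demo boils down to a lookup from a decoded focus target to a device command, typically gated by how long your gaze dwells on the target so a stray glance doesn’t flip the channel. A toy version follows; the target names mirror the demo, and the dwell threshold is an invented, illustrative value, not NextMind’s real API.

```python
# Toy mapping from a decoded focus target to a device command.
FOCUS_COMMANDS = {
    "red_box": "lamp.color=red",
    "blue_box": "lamp.color=blue",
    "green_box": "lamp.color=green",
    "menu_triangle": "tv.menu",
    "mute_triangle": "tv.mute",
    "pause_triangle": "tv.pause",
}

def on_focus(target: str, dwell_ms: int, threshold_ms: int = 800):
    """Fire a command only after focus has dwelled long enough;
    the 800 ms threshold is an assumed value for illustration."""
    if dwell_ms >= threshold_ms:
        return FOCUS_COMMANDS.get(target)
    return None
```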
“We have several use cases, but we are also targeting entertainment and gaming because that’s where this technology is going to have its best use,” Kouider said. “The experience of playing or applying it on VR for instance or augmented reality is going to create some new experiences of acting on a virtual world.”
NextMind’s technology isn’t available to consumers yet, but the company is selling a $399 developer kit in the hope that other companies will create new applications.
“I think it’s going to still take some time until we nail … the right use case,” Kouider said. “That’s the reason we are developing this technology, to have people use the platform and develop their own use cases.”
Another company focused on the brain-computer interface, BrainCo, makes the FocusOne headband, with sensors on the forehead that measure activity in your frontal cortex. The “wearable brainwave visualizer” is designed to measure focus, and its creators want it to be used in schools.
“FocusOne is detecting the subtle electrical signals that your brain is producing,” BrainCo President Max Newlon said. “When those electrical signals make their way to your scalp, our sensor picks them up, takes a look at them and determines, ‘Does it look like your brain is in a state of engagement? Or does it look like your brain is in a state of relaxation?’”
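A standard way to make that engagement-versus-relaxation call, and almost certainly not BrainCo’s exact method, is to compare power in the EEG beta band (associated with active concentration) against the alpha band (associated with relaxed wakefulness). A sketch of that textbook heuristic, assuming a 256 Hz sampling rate and a made-up decision threshold:

```python
import numpy as np

FS = 256  # Hz, an assumed sampling rate for a consumer headband

def band_power(eeg: np.ndarray, low: float, high: float) -> float:
    """Mean power of a 1-D EEG window in the [low, high] Hz band."""
    power = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(eeg.size, d=1.0 / FS)
    return power[(freqs >= low) & (freqs <= high)].mean()

def mental_state(eeg: np.ndarray) -> str:
    """Classify engagement vs. relaxation from the beta/alpha power ratio.
    A textbook heuristic, not BrainCo's algorithm; the threshold is a guess."""
    beta = band_power(eeg, 13, 30)   # prominent during concentration
    alpha = band_power(eeg, 8, 12)   # prominent during relaxed wakefulness
    return "engaged" if beta / alpha > 1.0 else "relaxed"
```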
Wearing the headband, I tried a video game with a rocket ship. The harder I focused, the faster the rocket ship moved, increasing my score. I then tried to slow the rocket ship down by relaxing my mind. A light on the front of the headband turns red when your brain is intensely focused, yellow if you’re in a relaxed state and blue if you’re in a meditative state. The headbands are designed to help kids learn to focus their minds, and to enable teachers to see when kids are zoning out. The headband costs $350 for schools and $500 for consumers, and it comes with software and games to help users learn how to focus and meditate.
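The feedback loop itself is simple to picture: a normalized focus score drives both the rocket’s speed and the indicator light. The cutoffs below are assumptions, since BrainCo doesn’t publish its thresholds.

```python
def rocket_speed(focus_score: float, max_speed: float = 10.0) -> float:
    """The harder you focus, the faster the rocket moves (score in [0, 1])."""
    return max_speed * focus_score

def led_color(focus_score: float) -> str:
    """Map the score to the headband light; cutoff values are guesses."""
    if focus_score > 0.7:
        return "red"      # intensely focused
    if focus_score > 0.4:
        return "yellow"   # relaxed
    return "blue"         # meditative
```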
BrainCo also has a prosthetic arm coming to market later this year, which will cost $10,000 to $15,000, less than half the cost of an average prosthetic. BrainCo’s prosthetic detects muscle signals and feeds them through an algorithm that can help it operate better over time, Newlon said.
“The thing that sets this prosthetic apart is that, after enough training, [a user] can control individual fingers and it doesn’t only rely on predetermined gestures. It’s actually like a free-play mode where the algorithm can learn from him, and he can control his hands just like we do,” Newlon said.
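One way an algorithm can “learn from him” over time is online classification: the decoder keeps a running average of the muscle-signal features it sees for each finger and refines those averages with every labeled repetition. This nearest-centroid sketch illustrates that learn-as-you-go idea; it is not BrainCo’s actual model, which isn’t public.

```python
import numpy as np

class FingerDecoder:
    """Toy online decoder mapping EMG feature vectors to finger labels.
    An illustration of an adaptive classifier, not BrainCo's model."""

    def __init__(self, n_features: int, fingers: list[str]):
        self.centroids = {f: np.zeros(n_features) for f in fingers}
        self.counts = {f: 0 for f in fingers}

    def update(self, features: np.ndarray, finger: str) -> None:
        """Refine a finger's learned signature (a running mean) after
        each labeled repetition during training or 'free play'."""
        self.counts[finger] += 1
        c = self.centroids[finger]
        c += (features - c) / self.counts[finger]

    def predict(self, features: np.ndarray) -> str:
        """Move the finger whose learned signature is closest to the
        incoming muscle-signal features."""
        return min(self.centroids,
                   key=lambda f: float(np.linalg.norm(features - self.centroids[f])))
```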