How to be a Mind Reader: Brain-Computer Interface Gives a Voice to Those Who Cannot Speak

by: Katerina Furman

Until now, mind-reading seemed like a distant fantasy. However, with the new Brain-Computer Interface developed by Bio-X scientists at Stanford University, it has become a feasible reality. The new technology directly reads brain signals to type out words as they are thought. The research team is developing direct brain-interface methods that could provide a means of communication for individuals with Amyotrophic Lateral Sclerosis (ALS) and other forms of severe paralysis. Previous models of this technology have been tested, but this is the first model that "enables a typing rate sufficient for meaningful conversation," according to Paul Nuyujukian, a post-doctoral fellow in Stanford's Department of Neurosurgery and one of the developers of the brain-tracking interface. If development continues on its current trend, this technology may eventually be paired with the word-completion features used by smartphones to further improve typing speed.
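To give a sense of what pairing a speller with word completion means in practice, here is a minimal sketch of the kind of prefix-based completion smartphone keyboards use. The vocabulary, frequency counts, and function names are illustrative assumptions, not part of the Stanford system:

```python
# Minimal sketch of prefix-based word completion, the kind of feature a BCI
# speller could borrow from smartphone keyboards. The vocabulary and counts
# here are illustrative placeholders, not part of the Stanford system.
def complete(prefix, vocabulary, max_suggestions=3):
    """Return the most frequent vocabulary words starting with `prefix`."""
    matches = [(word, count) for word, count in vocabulary.items()
               if word.startswith(prefix)]
    matches.sort(key=lambda pair: pair[1], reverse=True)  # most frequent first
    return [word for word, _ in matches[:max_suggestions]]

# Toy frequency-ranked vocabulary (hypothetical counts).
vocab = {"hello": 120, "help": 95, "helmet": 12, "hamlet": 30}
print(complete("hel", vocab))  # ['hello', 'help', 'helmet']
```

With completion, a user spelling "hel" could accept a whole suggested word with one selection instead of typing each remaining letter, which is why it multiplies effective typing speed.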

Brain-Computer Interface (BCI), also known as mind-machine interface (MMI), direct neural interface (DNI), or brain-machine interface (BMI), is a technological system that allows individuals to communicate using their brain waves. BCI research began in the 1970s at the University of California, Los Angeles; the first human models were employed in the 1990s. Various BCI systems exist, and they work in different ways, but the general idea is the same: a computer responds to changes in brain activity, whether evoked by attending to different letters, imagining movements, performing calculations, or experiencing specific movement sensations.

There are two kinds of BCI: invasive and noninvasive. Invasive BCIs are surgically implanted into the gray matter of the brain and produce the highest-quality signals. The disadvantage of invasive BCIs is their susceptibility to scar-tissue build-up, which can weaken or eliminate the signals. Noninvasive, or EEG-based, BCIs comprise the majority of BCI technology and are worn via electrodes on the exterior of the skull. However, the signals they record are often dampened and blurred by the skull itself. Non-EEG noninvasive techniques are also being developed; one works by detecting covert attention to oscillating letters displayed on a virtual keyboard, as reflected in the oscillation of the user's pupils. Functionality on one such device is enhanced when users mentally rehearse words such as "bright" and "dark" in synchrony with the brightness of the letters on the keyboard.
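As a rough illustration of how such a pupillometry-based speller might pick a letter, the sketch below correlates a recorded pupil-size trace with the brightness waveform assigned to each letter and selects the best match. All signals here are synthetic and the letter set is made up; a real device would need filtering and lag correction that this toy example omits:

```python
import numpy as np

# Illustrative sketch: each letter flickers with its own brightness waveform,
# and the pupil oscillates in sync with the waveform of the attended letter.
# We pick the letter whose waveform best correlates with the pupil trace.
rng = np.random.default_rng(0)
t = np.linspace(0, 4, 400)  # 4 seconds of samples at 100 Hz

# Hypothetical brightness waveforms: same frequency, different phases.
letters = {"A": np.sin(2 * np.pi * 1.0 * t),
           "B": np.sin(2 * np.pi * 1.0 * t + np.pi / 2),
           "C": np.sin(2 * np.pi * 1.0 * t + np.pi)}

# Simulated pupil trace: follows letter "B" plus measurement noise.
pupil = letters["B"] + 0.5 * rng.standard_normal(t.size)

# Correlate the pupil trace with each letter's waveform and pick the max.
scores = {name: np.corrcoef(pupil, wave)[0, 1] for name, wave in letters.items()}
print(max(scores, key=scores.get))  # expected: "B"
```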

BCI is a step up from eye-tracking technology, which has served many individuals with various forms of paralysis, most commonly ALS. ALS (also called Lou Gehrig's Disease) is a terminal condition characterized by the gradual onset of progressively worsening muscle weakness and, eventually, paralysis.

Eye-tracking works by following the movement of the eyes or head, either by measuring electro-oculographic potentials with electrodes placed around the eyes or with a camera imaging system. The device tracks the eyes as they move across an on-screen keyboard, letting the user select and click letters and pop-ups, and even choose an option that reads the typed text aloud. Brain-Computer Interfaces, sometimes referred to as neuroprosthetics, are an upgrade over eye-tracking in that a non-visual brain-computer interface remains functional for end-users who have drooping eyelids or limited gaze control. Stephen Hawking, for example, was unable to use eye-tracking software due to his drooping eyelids, and many other people simply find eye-tracking tiring. Brain-tracking technologies make communication possible for these users.
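To make the eye-tracking pipeline concrete, here is a minimal sketch of dwell-based letter selection: gaze position is estimated from horizontal and vertical electro-oculographic voltages via an assumed linear calibration, and a key is "clicked" once the gaze rests on it for a dwell period. The gains, thresholds, and data format are hypothetical, not parameters from any commercial system:

```python
# Minimal sketch of EOG-driven dwell selection. The linear calibration
# (volts -> screen pixels) and the dwell threshold are illustrative
# assumptions, not figures from a specific device.
DWELL_SECONDS = 1.0            # gaze must rest on a key this long to select it
SAMPLE_RATE = 60               # samples per second from the eye tracker
GAIN_X, GAIN_Y = 200.0, 150.0  # hypothetical pixels-per-volt calibration

def gaze_from_eog(h_volts, v_volts):
    """Map horizontal/vertical EOG voltages to screen coordinates."""
    return GAIN_X * h_volts + 400, GAIN_Y * v_volts + 300  # offset to screen center

def dwell_select(samples, key_at):
    """Yield a key each time gaze dwells on it long enough.

    samples: iterable of (h_volts, v_volts) pairs.
    key_at:  function mapping (x, y) to a key name, or None between keys.
    """
    needed = int(DWELL_SECONDS * SAMPLE_RATE)
    current, count = None, 0
    for h, v in samples:
        key = key_at(*gaze_from_eog(h, v))
        count = count + 1 if key == current else 1
        current = key
        if current is not None and count == needed:
            yield current  # "click" the key; lingering longer does not repeat it
```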

Krishna Shenoy, one of the leading researchers of the Brain Machine Initiative at Stanford, claims that a "brain-controlled prosthesis will lead to a substantial improvement in quality of life." The new brain-controlled prosthesis is designed to continuously correct brain readings so that people with paralysis can communicate. The device taps into relevant regions of the brain and delivers thought commands to devices (such as virtual keypads) while bypassing damaged connections.

In an experiment at the Stanford Neuroscience Institute, rhesus monkeys transcribed passages from the New York Times and Hamlet at a rate of 12 words per minute. Using a multi-electrode array implanted in the brain and a high-performance decoding algorithm, researchers first set a performance baseline by measuring how many targets the monkeys could tap with their fingers in 30 seconds. The experimental phase consisted of a typing task measuring how many virtual taps the monkeys could generate with a brain-controlled cursor in the same amount of time. Taps were elicited by showing the monkeys a sequence of dots overlaid on a Metropolis keyboard layout (the letters themselves were not visible to the monkeys). Target dots lit up in green, and the monkeys were trained, using rewards, to navigate to and select them. They performed both a dwell-typing task, in which they had to hover the cursor over the indicated key, and a click-typing task based on cursor velocity plus a discrete click, which proved much faster-paced but more strenuous. Over the course of the experimental sessions, the monkeys typed out entire articles from the New York Times.
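A minimal sketch contrasting the two selection modes described above: in dwell typing the cursor must sit on a key for a fixed time, while in click typing a selection fires as soon as a decoded click arrives while the cursor is slow and over a key. The frame format, field names, and thresholds are assumptions chosen only to make the two rules explicit:

```python
# Illustrative contrast between dwell typing and click typing. `frames` is a
# stream of decoded states per video frame; the dict fields and thresholds
# are hypothetical, not the Stanford decoder's actual interface.
DWELL_FRAMES = 45   # dwell mode: frames the cursor must stay on one key
SLOW_SPEED = 0.05   # click mode: cursor counts as "settled" below this speed

def dwell_typing(frames):
    """Select a key after the cursor rests on it for DWELL_FRAMES frames."""
    key, streak = None, 0
    for f in frames:
        streak = streak + 1 if f["key"] == key else 1
        key = f["key"]
        if key is not None and streak == DWELL_FRAMES:
            yield key

def click_typing(frames):
    """Select a key when a decoded click arrives while the cursor is slow."""
    for f in frames:
        if f["key"] is not None and f["click"] and f["speed"] < SLOW_SPEED:
            yield f["key"]
```

The trade-off the monkeys experienced falls out of the rules themselves: click typing never waits out a dwell timer, so it is faster, but it demands producing an extra discrete signal on every selection.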

Monkey J performed best, achieving an average typing rate of 10 words per minute in the dwell-typing category and 12 words per minute in the click-typing category. The average experimental score was 26 thought-taps in 30 seconds, about 90% of the efficacy of the 29 finger-tap baseline. These results cannot be directly converted into human typing rates, since the experiment did not account for the cognitive load of word and sentence formation that human users face but the monkeys did not.
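The reported numbers are internally consistent, as a quick check shows (the five-characters-per-word convention is the standard typing-speed assumption, not a figure quoted by the study):

```python
# Quick consistency check on the reported figures.
taps, baseline = 26, 29
print(round(taps / baseline, 3))    # 0.897 -> roughly the 90% efficacy quoted

selections_per_min = taps * 2       # 26 taps per 30 s = 52 per minute
CHARS_PER_WORD = 5                  # standard words-per-minute convention
print(selections_per_min / CHARS_PER_WORD)  # 10.4 -> in line with 10-12 wpm
```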

In another recent experiment this year at Duke University, a monkey successfully drove a motorized wheelchair using a BCI. The monkey was seated in the chair and hooked up to an intracranial implant that monitored its brain waves. The system translated the primate's large-scale electrical brain activity into motor commands, using a decoder trained during a period in which the animal was passively navigated through the task: moving the wheelchair toward a bowl of grapes.
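One way such a decoder can be trained from passive navigation, sketched below under simple assumptions, is ordinary least squares: neural activity recorded while the chair is driven for the animal is regressed against the chair's known velocities, and the fitted weights later turn brain activity into drive commands. The dimensions, synthetic data, and linear model are illustrative; the Duke system's actual decoder may differ:

```python
import numpy as np

# Sketch of fitting a linear decoder from a passive-navigation session.
# Rows of X are neural activity snapshots (firing rates of N channels);
# rows of Y are the wheelchair's (forward, rotational) velocity at the same
# instants, known because the chair was being driven for the animal.
rng = np.random.default_rng(1)
n_samples, n_channels = 500, 64
true_W = rng.standard_normal((n_channels, 2))

X = rng.standard_normal((n_samples, n_channels))            # synthetic activity
Y = X @ true_W + 0.1 * rng.standard_normal((n_samples, 2))  # synthetic velocities

# Least-squares fit: W maps neural activity to velocity commands.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Afterward, new activity decodes to drive commands with one matrix product.
new_activity = rng.standard_normal(n_channels)
forward, rotation = new_activity @ W
```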

BCIs are not yet generally available for commercial use; the intendiX from g.tec is the only model available for purchase. Most research institutions are still running trials on their BCIs to achieve the utmost speed, efficiency, and functionality for users. Brain-controlled prostheses currently work by accessing only a few hundred of the roughly 100 billion neurons in the human brain, even though the motor commands being processed involve millions of neurons. "These brain dynamics are analogous to rules that characterize the interactions of the millions of neurons that control motions," says Jonathan Kao, a contributor to the study and a doctoral student at Stanford University. The primary current objective is to make the device more precise by adjusting the measured signals so that the sampled dynamics better resemble baseline brain dynamics.
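Decoders of this kind are often built around a Kalman filter, which smooths the noisy neural readout toward plausible movement dynamics in exactly the spirit described above. The sketch below shows the standard predict/update cycle for a 2-D cursor-velocity state; the matrices are toy placeholders, not the Stanford group's fitted parameters:

```python
import numpy as np

# Toy Kalman-filter cursor decoder: the state is 2-D cursor velocity, the
# observation is a vector of firing rates assumed linear in velocity.
# A, C, and the noise covariances are placeholders; in a real system they
# are fitted to recorded neural data.
A = 0.95 * np.eye(2)   # velocity dynamics: smooth, slowly decaying motion
W = 0.02 * np.eye(2)   # process noise covariance
C = np.random.default_rng(2).standard_normal((20, 2))  # neural tuning matrix
Q = 0.5 * np.eye(20)   # observation noise covariance

def kalman_step(x, P, rates):
    """One predict/update cycle: returns new velocity estimate and covariance."""
    # Predict: advance the velocity state under the assumed dynamics.
    x_pred = A @ x
    P_pred = A @ P @ A.T + W
    # Update: pull the prediction toward the observed firing rates.
    S = C @ P_pred @ C.T + Q
    K = P_pred @ C.T @ np.linalg.inv(S)        # Kalman gain
    x_new = x_pred + K @ (rates - C @ x_pred)
    P_new = (np.eye(2) - K @ C) @ P_pred
    return x_new, P_new

# Each frame: feed the latest firing rates, move the cursor by the estimate.
x, P = np.zeros(2), np.eye(2)
rates = C @ np.array([0.1, -0.2])  # synthetic observation for a test velocity
x, P = kalman_step(x, P, rates)
```

The dynamics matrix A is what encodes the "rules" Kao describes: it biases each estimate toward motions that look like natural, smooth movement rather than raw, jittery neural noise.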

The potential for this innovation is huge. If Brain-Machine Interface development continues on its current trend, it could give individuals with spinal cord injuries and paralysis ways to move and communicate that were previously impossible. BCIs hold the potential for paralyzed individuals to walk using a thought-controlled exoskeleton, for amputees to control robotic limbs through a neural prosthesis, for individuals with aphasia and other speech-inhibiting conditions to communicate for the first time in their lives, and more. BCI exemplifies everything science is about: taking the first step in a new field to render the impossible possible.
