For more details, see Imagining a new interface: Hands-free communication without saying a word | Tech@Facebook; also see Brain-computer interfaces are developing faster than the policy debate around them | The Verge
"Chevillet’s optimism is fueled in large part by a first in the field of brain-computer interfaces that hit the presses this morning: In the journal Nature Communications, a team at the University of California, San Francisco, funded by Facebook Reality Labs, has built a brain-computer interface that accurately decodes dialogue—words and phrases both heard and spoken by the person wearing the device—from brain signals in real time.
Facebook Closer to Augmented Reality Glasses With Brain Implant That Decodes Dialogue From Neural Activity | IEEE Spectrum
The results are an important step toward neural implants that could be used to restore natural communication to patients who have lost the ability to speak due to stroke, spinal cord injury, or other conditions, says senior author and UCSF neurosurgeon Edward Chang.
Facebook, however, is more interested in building augmented reality glasses than biomedical devices. This work provides a proof of principle that it is possible to decode imagined speech from brain signals by measuring the activity of large populations of neurons, says Chevillet. “This [result] helps set the specification of what type of wearable device we need to build.”"