The jaw-worn interface reads the movements of small facial muscles, recognizes those that correspond to the subvocalization of individual words, responds with a voice delivered directly into the user's ear, and relays commands spoken in this "inner voice" to other devices.
A group of MIT engineers has developed AlterEgo, a device that hears what you think. The earpiece has "tentacles" whose tips rest on the user's chin; it picks up the faint facial-muscle and nerve signals that correlate with the mental pronunciation (subvocalization) of individual words and translates them into a radio signal, which it uses to control other devices: smartphones, voice assistants, or the headsets of other users.
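The recognition step can be pictured as matching a vector of electrode readings against learned word prototypes. The sketch below is hypothetical, not MIT's actual pipeline: the feature values are invented toy data, and a simple nearest-centroid classifier stands in for whatever model AlterEgo really uses; only the 4-electrode count comes from the article.

```python
# Toy sketch (hypothetical): classify subvocalized "words" from
# EMG-like feature vectors with a nearest-centroid classifier.
from statistics import mean
from math import dist

# Pretend each sample is a 4-element feature vector, one value per
# chin/jaw electrode (the device uses 4 electrodes).
TRAINING = {
    "one": [[0.9, 0.1, 0.2, 0.1], [0.8, 0.2, 0.1, 0.2]],
    "two": [[0.1, 0.9, 0.8, 0.1], [0.2, 0.8, 0.9, 0.2]],
}

def centroids(training):
    """Average each word's training vectors into one prototype."""
    return {
        word: [mean(col) for col in zip(*samples)]
        for word, samples in training.items()
    }

def classify(sample, protos):
    """Return the word whose prototype is closest to the sample."""
    return min(protos, key=lambda w: dist(sample, protos[w]))

protos = centroids(TRAINING)
print(classify([0.85, 0.15, 0.15, 0.15], protos))  # -> one
```

A real system would extract features from raw neuromuscular signals over time rather than from single static vectors, but the match-against-learned-patterns idea is the same.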
AlterEgo communicates with the user through vibrations that travel to the inner ear via the bones of the skull and are perceived as sound. By "thinking" the word "time", the user sends a query; the answer arrives directly in the user's ear and cannot be heard by others. A detailed description of the device can be found here.
So far the device recognizes only a small set of mental commands: digits and monosyllables. Recognition accuracy has been brought up to 92% and is expected to improve as the machine-learning algorithm receives more material to analyze, which will require more volunteers for testing. The number of electrodes in contact with the face has been reduced from 16 to 4, and it may be possible to cut it to one. The device's creators believe AlterEgo can stand in for neural interfaces while those are still in development.
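A recognition-accuracy figure like the 92% above is conventionally computed as the share of correct predictions on a held-out test set. A minimal sketch, using invented toy labels rather than any real AlterEgo data:

```python
# Hypothetical sketch: accuracy = correct predictions / total samples.
def accuracy(predicted, actual):
    """Fraction of predictions that match the true labels."""
    correct = sum(p == a for p, a in zip(predicted, actual))
    return correct / len(actual)

labels      = ["one", "two", "one", "two", "one"]
predictions = ["one", "two", "one", "one", "one"]
print(f"{accuracy(predictions, labels):.0%}")  # -> 80%
```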