MIT's alterego headset can hear and transcribe the voice in your head

alterego is an AI-powered headset developed by the MIT media lab that picks up on the neuromuscular signals triggered when you subvocalize – that is, talking in your head, in layman’s terms. the system consists of a wearable device and an associated computing system that can transcribe words the user verbalizes internally but does not speak aloud.


electrodes in the device pick up neuromuscular signals in the jaw and face that are triggered by internal verbalizations — saying words ‘in your head’ — but are undetectable to the human eye. the signals are then fed to a machine-learning system that has been trained to correlate particular signals with particular words.
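to make the signal-to-word mapping described above concrete, here is a minimal illustrative sketch. the real alterego system uses a neural network trained on multi-channel electrode recordings; the vocabulary, signal shapes, features, and nearest-centroid classifier below are all simplified stand-ins, not the actual MIT pipeline.

```python
import numpy as np

VOCAB = ["yes", "no", "up", "down"]  # hypothetical mini-vocabulary
RNG = np.random.default_rng(0)

def make_signal(word_idx, n_channels=7, n_samples=50):
    """simulate a multi-channel neuromuscular recording for one word
    (each word gets a different signal amplitude, plus sensor noise)."""
    base = (word_idx + 1) * np.sin(np.linspace(0, np.pi, n_samples))
    return base + 0.05 * RNG.standard_normal((n_channels, n_samples))

def featurize(signal):
    """mean absolute value per channel, a common EMG-style feature."""
    return np.abs(signal).mean(axis=1)

# 'training': average the features of a few example recordings per word,
# so each word is represented by one centroid in feature space
centroids = np.stack([
    np.mean([featurize(make_signal(i)) for _ in range(20)], axis=0)
    for i in range(len(VOCAB))
])

def transcribe(signal):
    """map a new recording to the closest learned word centroid."""
    dists = np.linalg.norm(centroids - featurize(signal), axis=1)
    return VOCAB[int(np.argmin(dists))]
```

a recording simulated for one word is then transcribed back to that word, e.g. `transcribe(make_signal(2))` returns `"up"` – the same correlate-signals-with-words idea, shrunk to a toy.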


video by MIT media lab



the idea may seem like pure science fiction, but in fact internal verbalization and its physical correlates have been a subject of serious study since the 19th century. one of the goals of the speed-reading movement of the 1960s was to eliminate internal verbalization, or ‘subvocalization,’ as it’s known.


‘the motivation for this was to build an IA device, an intelligence-augmentation device,’ says arnav kapur, a graduate student at the MIT media lab, who led the development of the new system. ‘our idea was: could we have a computing platform that’s more internal, that melds human and machine in some ways, and that feels like an internal extension of our own cognition?’

  • @Paul – The device reads “intended” subvocalized speech, i.e. when you say words in your head but don’t vocalize them. It does not read your thoughts. In fact, thoughts read as noise, and from what I understand the researchers chose this modality precisely to avoid that noise and capture only voluntary intentions.

    @Laura – This is still in the early stages of development, but hopefully it will reach a level where it starts to make a more tangible impact and helps people who have had strokes, etc., to “speak”.

  • My son had a stroke and is now non-verbal – where & when can I get this???

  • So…remote cameras can already translate the vibration of houseplant leaves into conversations in a distant apartment. Looks like a short hop, using this tech, to reading our unexpressed thoughts next?

    Paul d'Orléans


