Brain2Qwerty: AI Translates Brain Signals into Text Without Surgery

Image Source: ChatGPT-4o
Researchers from Meta AI have introduced Brain2Qwerty, a deep learning model designed to decode brain signals into text using non-invasive brain activity recordings. Unlike traditional brain-computer interfaces (BCIs) that require surgical implants, this approach uses electroencephalography (EEG) and magnetoencephalography (MEG) to analyze how the brain processes typing movements—laying the groundwork for a future where people could type using only their thoughts.
How It Works
Participants were not typing in order to communicate; they typed sentences while their brain activity was recorded, giving the AI paired examples it could use to learn which brain signals correspond to each key press.
Brain activity was recorded using EEG or MEG, non-invasive methods that track the brain's electrical and magnetic activity, respectively.
The Brain2Qwerty model, a combination of convolutional networks, transformers, and a pretrained language model, analyzed these signals and attempted to reconstruct what the participant was typing based solely on their brain activity.
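To make that architecture concrete, here is a minimal, illustrative PyTorch sketch of a model that applies convolutions to multi-sensor signal windows and a transformer for temporal context before predicting a character. This is not Meta's actual Brain2Qwerty code: the sensor count, layer sizes, and character vocabulary are placeholder assumptions, and the pretrained language model that refines the character predictions is only indicated in a comment.

```python
import torch
import torch.nn as nn

class SignalToCharSketch(nn.Module):
    """Illustrative convolution + transformer model over EEG/MEG windows.

    Not the published Brain2Qwerty implementation; all sizes are placeholders.
    """

    def __init__(self, n_sensors=208, n_chars=30, d_model=128):
        super().__init__()
        # Convolutions summarize short windows of multi-sensor activity.
        self.conv = nn.Sequential(
            nn.Conv1d(n_sensors, d_model, kernel_size=7, padding=3),
            nn.GELU(),
            nn.Conv1d(d_model, d_model, kernel_size=7, padding=3),
            nn.GELU(),
        )
        # A transformer encoder adds longer-range temporal context.
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=4, batch_first=True
        )
        self.transformer = nn.TransformerEncoder(encoder_layer, num_layers=2)
        # A linear head scores one character per window; in the described setup,
        # a pretrained language model would further refine these predictions.
        self.char_head = nn.Linear(d_model, n_chars)

    def forward(self, x):
        # x: (batch, n_sensors, time_samples) raw sensor windows
        h = self.conv(x)                         # (batch, d_model, time)
        h = self.transformer(h.transpose(1, 2))  # (batch, time, d_model)
        return self.char_head(h.mean(dim=1))     # (batch, n_chars) logits


# Toy usage: 4 windows, 208 sensors, 250 time samples each.
model = SignalToCharSketch()
logits = model(torch.randn(4, 208, 250))
print(logits.shape)  # torch.Size([4, 30])
```

The three stages mirror the components named above: convolutions extract local features from the raw sensor traces, the transformer supplies context across time, and a language model cleans up the character predictions into readable text.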
Key Findings
Brain2Qwerty Performance:
With MEG, the model achieved an average character error rate (CER) of 32%, significantly outperforming EEG (CER: 67%); a short sketch of how this metric is computed follows the list of findings.
The best-performing participants reached a CER of 19%, meaning the system could reconstruct their sentences with high accuracy from brain signals alone.
The system outperforms standard non-invasive methods and helps bridge the gap between invasive and non-invasive BCIs.
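For context on what these percentages mean, character error rate is typically computed as the edit distance between the decoded text and the text the participant actually typed, divided by the length of the typed text. The following minimal sketch shows that standard calculation; the function name and example strings are illustrative and not taken from the study.

```python
def character_error_rate(reference: str, hypothesis: str) -> float:
    """Levenshtein edit distance between hypothesis and reference,
    divided by the reference length: the standard CER metric."""
    prev = list(range(len(hypothesis) + 1))
    for i, ref_char in enumerate(reference, start=1):
        curr = [i]
        for j, hyp_char in enumerate(hypothesis, start=1):
            cost = 0 if ref_char == hyp_char else 1
            curr.append(min(prev[j] + 1,          # deletion
                            curr[j - 1] + 1,      # insertion
                            prev[j - 1] + cost))  # substitution
        prev = curr
    return prev[-1] / max(len(reference), 1)


# A CER of 0.32 (32%) means roughly one in three characters in the decoded
# text is wrong, missing, or spurious relative to what was actually typed.
print(character_error_rate("hello world", "hella wrld"))  # ~0.18
```

By this measure, the best MEG participants' 19% CER corresponds to roughly four out of five characters in a sentence being decoded correctly.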
Insights & Challenges
While MEG produced a much clearer signal than EEG, decoding accuracy varied across participants—with some achieving much higher accuracy than others.
Typing errors and keyboard layout affected decoding performance, suggesting that both motor functions (finger movement) and cognitive functions (word selection and intent) play a role.
The system does not yet run in real time, and it currently requires participants to complete a typing session for training before it can generate text purely from brain activity.
The model was tested on healthy participants, meaning additional adaptations are needed for non-verbal or paralyzed individuals—the group it’s ultimately designed to help.
How It Compares to Elon Musk’s Neuralink
Unlike Elon Musk’s Neuralink, which requires surgically implanted electrodes to interface directly with the brain, Brain2Qwerty achieves sentence decoding using completely external methods like EEG and MEG. While Neuralink currently offers higher accuracy and real-time processing, it comes with surgical risks and accessibility challenges. In contrast, Brain2Qwerty represents a safer, non-invasive alternative that, with further improvements, could provide a viable communication tool for patients with paralysis or speech impairments without requiring brain surgery.
What This Means
This research marks a major step forward for non-invasive brain-computer interfaces. While invasive devices like Neuralink still lead in accuracy and real-time processing, advances in AI-powered decoding models like Brain2Qwerty could eventually bridge the gap—offering scalable, non-surgical communication tools for people whose neurological conditions impair speech or movement. As the technology progresses, we may see a future where BCIs become widely accessible without the need for brain implants, opening new doors for communication, accessibility, and human-computer interaction.
Editor’s Note: This article was created by Alicia Shapiro, CMO of AiNews.com, with writing, image, and idea-generation support from ChatGPT, an AI assistant. However, the final perspective and editorial choices are solely Alicia Shapiro’s. Special thanks to ChatGPT for assistance with research and editorial support in crafting this article.