Scientists convert brain signals into words using AI
EEG-based system could help speech-impaired patients communicate via thoughts.

Australian scientists have developed an AI model that converts brainwaves into spoken words and sentences using a wearable EEG cap.
The system, created at the University of Technology Sydney, marks a significant step in communication technology and cognitive care.
The deep learning model, designed by Daniel Leong, Charles Zhou, and Chin-Teng Lin, currently works with a limited vocabulary but has achieved around 75% accuracy. Researchers aim to improve this to 90% by expanding training data and refining brainwave analysis.
Bioelectronics expert Mohit Shivdasani noted that AI can now detect neural patterns previously hidden from human interpretation. Future applications could include real-time thought-to-text interfaces or direct brain-to-brain communication between people.
The breakthrough opens new possibilities for patients with speech or movement impairments, pointing to a future of human-machine interaction that bypasses traditional input methods.