Wearable brain-computer interface pairs EEG with AI for robotic control
Engineers at UCLA unveiled an AI-assisted brain-computer interface that enables robotic arm control without invasive surgery.
UCLA engineers have developed a wearable brain-computer interface that uses AI to interpret user intent, letting people control robotic arms and computer cursors.
The non-invasive system uses electroencephalography (EEG) to decode brain signals and combines them with an AI camera platform for real-time assistance. The results, published in ‘Nature Machine Intelligence’, demonstrate significant performance improvements over traditional BCIs.
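The core idea is shared control: a velocity decoded from noisy EEG is blended with a velocity proposed by the camera-based AI assistant. The paper does not publish this code; the snippet below is a minimal, hypothetical Python sketch of that blending step, with toy decoder weights, channel counts, and a blending parameter alpha that are all illustrative assumptions rather than the study's actual implementation.

```python
import numpy as np

# Hypothetical sketch of shared control: blend a velocity decoded from EEG
# with a velocity proposed by a vision-based AI co-pilot. All names, shapes,
# and the blending weight alpha are illustrative assumptions.

def decode_eeg_velocity(eeg_window: np.ndarray, weights: np.ndarray) -> np.ndarray:
    """Toy linear decoder: map per-channel EEG power features to a 2-D velocity."""
    features = np.log(np.var(eeg_window, axis=1) + 1e-9)  # crude band-power proxy
    return weights @ features  # (2, n_channels) @ (n_channels,) -> (2,)

def assistant_velocity(cursor: np.ndarray, target: np.ndarray, gain: float = 1.0) -> np.ndarray:
    """AI co-pilot proposal: head toward the target the camera platform has inferred."""
    direction = target - cursor
    norm = np.linalg.norm(direction)
    return gain * direction / norm if norm > 1e-6 else np.zeros(2)

def shared_control(v_user: np.ndarray, v_ai: np.ndarray, alpha: float = 0.5) -> np.ndarray:
    """Blend user intent and AI assistance; alpha=0 is pure user, alpha=1 pure AI."""
    return (1 - alpha) * v_user + alpha * v_ai

# One control tick of a hypothetical cursor task
rng = np.random.default_rng(0)
eeg_window = rng.standard_normal((8, 250))         # 8 channels, ~1 s at 250 Hz
decoder_weights = rng.standard_normal((2, 8)) * 0.1
cursor, target = np.zeros(2), np.array([0.6, -0.3])

v_user = decode_eeg_velocity(eeg_window, decoder_weights)
v_ai = assistant_velocity(cursor, target)
cursor += 0.05 * shared_control(v_user, v_ai, alpha=0.5)  # 50 ms time step
print(cursor)
```

In this toy formulation, raising alpha gives the co-pilot more authority, which mirrors the trade-off the researchers describe: more assistance speeds up task completion, while lower alpha preserves direct user control.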
Participants tested the device on two tasks: moving a cursor across a computer screen and directing a robotic arm to reposition blocks. All participants completed the tasks faster with AI assistance, and a paralysed participant who could not finish the robotic arm task unaided completed it in under seven minutes with the AI co-pilot.
Researchers emphasise the importance of safety and accessibility. Unlike surgically implanted BCIs, which remain confined to limited clinical trials, the wearable device avoids neurosurgical risks while offering new independence for people with paralysis or ALS.
Future development will focus on making AI ‘co-pilots’ more adaptive, allowing robotic arms to move with greater precision, dexterity, and task awareness.