Researchers use ultrasound and machine learning to decode and predict movement intentions in the brain

Researchers at the California Institute of Technology (Caltech) have developed a new type of brain-machine interface (BMI) that uses functional ultrasound (fUS) technology to read out brain activity corresponding to the planning of movement. Unlike more invasive BMIs, which rely on, for instance, electrodes implanted into the brain, the new system can record and map activity from precise regions deep within the brain without damaging brain tissue.

The technology was developed with the aid of non-human primates, which were taught simple tasks involving moving their eyes or arms. While the primates completed these tasks, fUS measured activity in the region of the brain involved in planning movement. The ultrasound imaging data and the corresponding tasks were then fed to a machine learning algorithm, which learned which patterns of brain activity correlated with which tasks.

Once trained, the algorithm was presented with ultrasound data collected from the primates in real time. Within a few seconds, it predicted which behaviour the primate was going to carry out (an eye movement or a reach), the direction of the movement (left or right), and when the primate planned to make it. The research is seen as a critical step in the development of neuro-recording and BMI tools that are less invasive, high resolution, and scalable.
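To make the train-then-decode idea concrete, here is a minimal sketch of that kind of pipeline. It does not use the researchers' actual data, model, or code: it generates synthetic trials standing in for flattened fUS image frames, injects an assumed class-dependent signal into a few "voxels", and uses a simple nearest-centroid decoder (a placeholder for whatever algorithm the team actually trained) to predict the planned movement direction on held-out trials.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for fUS imaging data: each trial is a flattened
# image of activity across 64 hypothetical voxels. This is NOT real
# fUS data; it is random noise plus an injected signal.
n_trials, n_voxels = 200, 64
X = rng.normal(size=(n_trials, n_voxels))

# Labels: 0 = leftward movement plan, 1 = rightward movement plan.
y = rng.integers(0, 2, size=n_trials)

# Inject a class-dependent signal into a few voxels so the decoder
# has something to learn, mimicking task-correlated brain activity.
X[y == 1, :5] += 1.5

# Split into training trials (used to learn the patterns) and
# held-out trials (a stand-in for real-time decoding).
X_train, y_train = X[:150], y[:150]
X_test, y_test = X[150:], y[150:]

# "Training": learn the mean activity pattern for each planned movement.
centroids = np.stack([X_train[y_train == c].mean(axis=0) for c in (0, 1)])

# "Decoding": assign each new trial to the nearest learned pattern.
dists = np.linalg.norm(X_test[:, None, :] - centroids[None, :, :], axis=2)
preds = dists.argmin(axis=1)

accuracy = (preds == y_test).mean()
print(f"held-out decoding accuracy: {accuracy:.2f}")
```

The same structure extends naturally to the study's multi-way predictions (behaviour type, direction, and timing) by training one decoder per question or using a multi-class model.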