A fast intracortical brain–machine interface with patterned optogenetic feedback
Abstract
The development of brain–machine interfaces (BMIs) brings new prospects to patients who have lost motor autonomy. By combining online recordings of brain activity with a decoding algorithm, patients can learn to control a robotic arm to perform simple actions. However, in contrast to the vast amount of somatosensory information channeled from the limbs to the brain, current BMIs are devoid of touch and force sensors. Patients must therefore rely solely on vision and audition, which are poorly suited to the control of a prosthesis. In a healthy limb, by contrast, somatosensory inputs alone can efficiently guide the handling of a fragile object or ensure a smooth trajectory. We have developed a BMI in the mouse that includes rich, artificial somatosensory-like cortical feedback.