Abstract
A brain-machine interface (BMI) for visually guided grasping would provide significant benefits for paralyzed patients, given the crucial role these movements play in everyday life. We have developed a BMI to decode grasp shape in real time in macaque monkeys. Neural activity was recorded using chronically implanted electrodes in the anterior intraparietal cortex (AIP) and ventral premotor cortex (F5), areas known to be involved in the transformation of visual signals into hand grasping instructions. In a first study, we decoded two grasp types (power and precision grip) and three grasp orientations (target oriented vertically or tilted left or right) from the neural activity during movement planning, with an accuracy of about 70%. These results provide a proof of concept for a BMI for visually guided grasping that could be extended to a larger number of grip types and grip orientations, as needed for prosthetic applications in humans.