Almost 15 years after being paralysed by a stroke, a 58-year-old American woman was once again able to serve herself coffee, thanks to a state-of-the-art DLR robot arm and hand that she controlled with neural signals recorded directly from her brain. It took her only moments to grasp the drinking bottle with the robot hand, bring it to her mouth and drink her coffee through a straw. To accomplish this, software decoded neural signals, recorded from a small array of electrodes, that reflected her intention to reach and grasp, and converted them into commands directing the robot arm and hand. Researchers from TUM and DLR present the results of their collaboration with Brown University, the United States Department of Veterans Affairs, and Massachusetts General Hospital in the May 17, 2012 issue of the journal Nature.

It is April 12, 2011; the BrainGate trial participant tracks the movements of the lightweight DLR robot with a look of concentration. As she imagines moving her own arm, her brain sends the associated signals to a computer via a four-by-four-millimetre sensor. Surgeons had implanted the sensor more than five years earlier in the motor area of her cerebral cortex, the thin sheet of neurons on the surface of the brain that generates motor commands. The computer decodes the signals, and the DLR robot arm and five-fingered hand execute the decoded instructions, taking the place of her own paralysed arm and enabling her to drink on her own for the first time since her stroke.
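Decoders of this kind typically map binned firing rates from the electrode array to a continuous velocity command that is integrated into a position setpoint for the robot. The sketch below is a purely hypothetical linear velocity decoder in Python, not the study's code: the channel count, weight matrix, baseline rates, and update interval are all illustrative assumptions.

```python
import numpy as np

class LinearVelocityDecoder:
    """Hypothetical linear decoder: binned spike rates -> 3-D velocity.

    Maps baseline-subtracted firing rates through a weight matrix to a
    Cartesian velocity command, then integrates it into a position
    setpoint for the robot arm. All parameters are stand-in values.
    """

    def __init__(self, weights, baseline, dt=0.02):
        self.W = weights            # (3, n_channels) decoding weights
        self.b = baseline           # (n_channels,) baseline firing rates
        self.dt = dt                # decode interval in seconds
        self.position = np.zeros(3)

    def step(self, rates):
        """One decode cycle: returns (velocity command, new setpoint)."""
        velocity = self.W @ (rates - self.b)
        self.position = self.position + velocity * self.dt
        return velocity, self.position.copy()

# Demo with random stand-in data (96 channels, as on a typical
# intracortical array; the real decoder is calibrated per session).
rng = np.random.default_rng(42)
n_channels = 96
decoder = LinearVelocityDecoder(
    weights=rng.normal(scale=0.01, size=(3, n_channels)),
    baseline=np.full(n_channels, 10.0),
)
rates = rng.poisson(10.0, size=n_channels).astype(float)
velocity, setpoint = decoder.step(rates)
```

In the actual trial the decoder was calibrated for the participant in each session; the fixed random weights here stand in only to make the data flow concrete.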

Following a brainstem stroke almost 15 years ago, she lost the ability to speak and was left unable to make any useful movement except with her head and eyes. When she brought the straw to her mouth for the first time in more than a decade, she smiled.

To maximise safety, the robot uses sensors to continuously check for unintended contact with its surroundings. On unexpected contact, the integrated controller intervenes, relaxing the robot within a few thousandths of a second and rendering it compliant so that it stops with a gentle touch. The system precisely controls the gripping force of the robot hand and the speed of the lightweight arm while it is being positioned by brain signals. This is the first time scientists have coupled the brain to a complex robot able to reproduce a human arm's capabilities.
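The reaction strategy described above can be sketched as a joint-space impedance controller that drops its stiffness to zero when the estimated external torque exceeds a threshold. The Python below is a minimal illustration under that assumption; the control law, thresholds, and gains are hypothetical, not DLR's actual controller.

```python
import numpy as np

def impedance_torque(q, q_des, dq, stiffness, damping):
    """Joint-space impedance law: tau = K (q_des - q) - D dq.

    q, q_des : measured and desired joint positions (rad)
    dq       : joint velocities (rad/s)
    stiffness, damping : per-joint gains (hypothetical units)
    """
    return stiffness * (q_des - q) - damping * dq

def control_step(q, q_des, dq, tau_ext, tau_limit, stiffness, damping):
    """One control cycle with a simple collision reaction.

    If the estimated external torque at any joint exceeds its limit,
    the commanded stiffness is set to zero so the (gravity-compensated)
    arm becomes compliant and stops gently instead of pushing on.
    Returns the commanded torque and the (possibly relaxed) stiffness.
    """
    if np.any(np.abs(tau_ext) > tau_limit):
        stiffness = np.zeros_like(stiffness)  # relax: zero stiffness
    return impedance_torque(q, q_des, dq, stiffness, damping), stiffness
```

Running this at a kilohertz-class rate is what allows a reaction "within a few thousandths of a second": the stiffness drop takes effect on the very next control cycle after the contact is detected.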

This research demonstrated that, even in people who have been paralysed for years, neural signals remain intact at a level where they can be used, for example, to control robotic limbs.

At DLR, the following three authors contributed to this paper:

Sami Haddadin

Jörn Vogel

PhD candidate, DLR
BCI robot control
joern.vogel@dlr.de, +49 8153 28-2166

Patrick van der Smagt

Current: Head of AI Research, Data Lab, VW Group

Previous: Director of BRML labs
fortiss, an affiliated institute of Technische Universität München
Professor for Biomimetic Robotics and Machine Learning, TUM

Chairman of Assistenzrobotik e.V.

Vogel J, Haddadin S, Simeral JD, Stavisky SD, Bacher D, Hochberg LR, Donoghue JP, van der Smagt P (2014). Continuous control of the DLR Light-weight Robot III by a human with tetraplegia using the BrainGate2 Neural Interface System. In O. Khatib, V. Kumar, G. Sukhatme (Eds.), Experimental Robotics, Vol. 79, pp. 125-136.
Liu J, Simeral JD, Stavisky SD, Bacher D, Vogel J, Haddadin S, van der Smagt P, Hochberg LR, Donoghue JP (2010). Control of a robotic arm using intracortical motor signals by an individual with tetraplegia in the BrainGate2 trial. 40th Annual Meeting of the Society for Neuroscience (SfN 2010).
Vogel J, Haddadin S, Simeral JD, Stavisky SD, Bacher D, Hochberg LR, Donoghue JP, van der Smagt P (2010). Continuous control of the DLR Light-weight Robot III by a human with tetraplegia using the BrainGate2 Neural Interface System. International Symposium on Experimental Robotics (ISER).