
Blending of brain-machine interface and vision-guided autonomous robotics improves neuroprosthetic arm performance during grasping

Downey, JE and Weiss, JM and Muelling, K and Venkatraman, A and Valois, JS and Hebert, M and Bagnell, JA and Schwartz, AB and Collinger, JL (2016) Blending of brain-machine interface and vision-guided autonomous robotics improves neuroprosthetic arm performance during grasping. Journal of NeuroEngineering and Rehabilitation, 13 (1).

Abstract

© 2016 Downey et al.

Background: Recent studies have shown that brain-machine interfaces (BMIs) offer great potential for restoring upper limb function. However, grasping objects is a complicated task and the signals extracted from the brain may not always be capable of driving these movements reliably. Vision-guided robotic assistance is one possible way to improve BMI performance. We describe a method of shared control where the user controls a prosthetic arm using a BMI and receives assistance with positioning the hand when it approaches an object.

Methods: Two human subjects with tetraplegia used a robotic arm to complete object transport tasks with and without shared control. The shared control system was designed to provide a balance between BMI-derived intention and computer assistance. An autonomous robotic grasping system identified and tracked objects and defined stable grasp positions for these objects. The system identified when the user intended to interact with an object based on the BMI-controlled movements of the robotic arm. Using shared control, BMI-controlled movements and autonomous grasping commands were blended to ensure secure grasps.

Results: Both subjects were more successful on object transfer tasks when using shared control compared to BMI control alone. Movements made using shared control were more accurate, more efficient, and less difficult. One participant attempted a task with multiple objects and successfully lifted one of two closely spaced objects in 92% of trials, demonstrating the potential for users to accurately execute their intention while using shared control.

Conclusions: Integration of BMI control with vision-guided robotic assistance led to improved performance on object transfer tasks. Providing assistance while maintaining generalizability will make BMI systems more attractive to potential users.

Trial registration: NCT01364480 and NCT01894802.
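The abstract describes blending BMI-decoded movement commands with autonomous grasping commands as the hand approaches an object. The sketch below illustrates one common way such shared control can be realized: a linear blend of the two velocity commands, weighted by proximity to the target. This is an illustrative assumption, not the paper's actual control law; the function name, the distance-based weighting, and the `assist_radius` parameter are all hypothetical.

```python
import numpy as np

def blend_commands(user_vel, auto_vel, dist_to_object, assist_radius=0.15):
    """Blend a BMI-decoded velocity with an autonomous grasp velocity.

    The blend weight ramps up as the hand nears the object: far away,
    the user's command dominates; close in, autonomous assistance
    dominates. `assist_radius` (meters) is a hypothetical tuning
    parameter, not a value taken from the paper.
    """
    user_vel = np.asarray(user_vel, dtype=float)
    auto_vel = np.asarray(auto_vel, dtype=float)
    # alpha in [0, 1]: 0 = pure BMI control, 1 = pure autonomy
    alpha = np.clip(1.0 - dist_to_object / assist_radius, 0.0, 1.0)
    return (1.0 - alpha) * user_vel + alpha * auto_vel

# Far from the object (0.5 m): command is entirely user-driven
print(blend_commands([0.1, 0.0, 0.0], [0.0, 0.1, 0.0], dist_to_object=0.5))
# Near the object (0.03 m): the autonomous grasp command dominates
print(blend_commands([0.1, 0.0, 0.0], [0.0, 0.1, 0.0], dist_to_object=0.03))
```

A design point worth noting: because the blend is continuous in distance, control authority hands off gradually rather than switching abruptly, which is consistent with the abstract's goal of balancing BMI-derived intention against computer assistance.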



Details

Item Type: Article
Status: Published
Creators/Authors:
  Downey, JE (jed92@pitt.edu, Pitt: JED92)
  Weiss, JM (jmw182@pitt.edu, Pitt: JMW182, ORCID: 0000-0003-1332-674X)
  Muelling, K
  Venkatraman, A
  Valois, JS
  Hebert, M
  Bagnell, JA
  Schwartz, AB (abs21@pitt.edu, Pitt: ABS21)
  Collinger, JL (collingr@pitt.edu, Pitt: COLLINGR)
Date: 1 January 2016
Date Type: Publication
Journal or Publication Title: Journal of NeuroEngineering and Rehabilitation
Volume: 13
Number: 1
DOI or Unique Handle: 10.1186/s12984-016-0134-9
Schools and Programs: School of Medicine > Neurobiology
School of Medicine > Physical Medicine and Rehabilitation
Swanson School of Engineering > Bioengineering
Refereed: Yes
Date Deposited: 23 Aug 2016 13:33
Last Modified: 28 Oct 2017 11:55
URI: http://d-scholarship.pitt.edu/id/eprint/28699

