
Real-Time Somatosensory Feedback for Neural Prosthesis Control: System Development and Experimental Validation

Bacher, Daniel H (2009) Real-Time Somatosensory Feedback for Neural Prosthesis Control: System Development and Experimental Validation. Master's Thesis, University of Pittsburgh. (Unpublished)

Primary Text: PDF (11 MB)
Abstract

Recent advances in neural prosthetics have provided patients with the ability to use signals derived from motor areas of the cerebral cortex to directly control an external device under visually guided closed-loop control. To attain a more natural form of prosthesis control, it is desirable to develop systems capable of providing real-time somatosensory feedback as well as visual feedback, akin to how we naturally process sensory information to control our limbs. To this end, a sophisticated data acquisition, control, and feedback system was developed for neural prosthetics and psychophysics research. The system deterministically collects and processes high-volume neural ensemble activity, limb kinematics, and eye movements while generating visual stimuli in an immersive three-dimensional virtual reality (VR) environment. A vibrotactile feedback device was also developed and incorporated into the system. It delivers real-time limb kinematics feedback in the form of continuous, graded vibratory stimulation. A flexible and intuitive user interface allows the researcher to design experimental paradigms and adjust parameters on the fly during experiments. A psychophysical study was conducted using this system to evaluate the potential use of vibrotactile feedback as a sensory substitution method to provide somatosensory feedback for neural prosthesis control. The study also aimed to provide insight into the mechanisms of multimodal sensory processing and sensory-motor control. Able-bodied human subjects performed a trajectory-following reach task in the VR environment under different visual and vibrotactile feedback conditions. The study showed that vibrotactile feedback is capable of enhancing motor performance, implying that subjects were able to integrate and effectively use this new 'proprioceptive-like' sensory modality. Subjects were also able to partially maintain task performance using vibrotactile feedback in the absence of visual feedback. Improved motor learning and motor skill consolidation were also observed after training in the VR environment with vibrotactile feedback. These results suggest that vibrotactile feedback may be a viable method for delivering somatosensory feedback for applications such as neural prosthesis control, motor rehabilitation, and enhanced human-computer interaction.
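The thesis does not reproduce its source code here (the keywords indicate a LabVIEW implementation), but the "continuous, graded vibratory stimulation" of limb kinematics described in the abstract can be illustrated with a minimal sketch. The function name, error scale, and number of intensity levels below are all assumptions for illustration, not the author's actual encoding:

```python
def grade_vibration(error, max_error=0.2, levels=8):
    """Illustrative mapping from a kinematic tracking error to a
    discrete vibrotactile intensity level in [0, levels - 1].

    A larger deviation from the target trajectory yields a stronger
    vibratory stimulus; the mapping is linear in |error| and clamped
    at max_error, giving continuous, graded feedback. All parameter
    values are hypothetical.
    """
    if max_error <= 0:
        raise ValueError("max_error must be positive")
    # Normalize |error| to [0, 1], clamping beyond max_error.
    normalized = min(abs(error) / max_error, 1.0)
    # Quantize to the available stimulus levels.
    return min(int(normalized * levels), levels - 1)
```

In a real-time loop, a function like this would be evaluated each control cycle and its output written to the vibrotactile actuator driver; the clamped linear mapping is only one plausible choice among many (logarithmic or sigmoidal gradings are equally defensible).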



Details

Item Type: University of Pittsburgh ETD
Status: Unpublished
Creators/Authors: Bacher, Daniel H (dhb10@pitt.edu, Pitt username: DHB10)
ETD Committee:
Committee Chair: Batista, Aaron (apb10@pitt.edu, Pitt username: APB10)
Committee Member: Weber, Douglas (djw50@pitt.edu, Pitt username: DJW50)
Committee Member: Stetten, George (stetten@andrew.cmu.edu)
Committee Member: Klatzky, Roberta (klatzky@cmu.edu)
Date: 25 September 2009
Date Type: Completion
Defense Date: 16 July 2009
Approval Date: 25 September 2009
Submission Date: 23 July 2009
Access Restriction: 5 year -- Restrict access to University of Pittsburgh for a period of 5 years.
Institution: University of Pittsburgh
Schools and Programs: Swanson School of Engineering > Bioengineering
Degree: MSBeng - Master of Science in Bioengineering
Thesis Type: Master's Thesis
Refereed: Yes
Uncontrolled Keywords: brain-computer interface; brain-machine interface; graphics; haptics; labview; software
Other ID: http://etd.library.pitt.edu/ETD/available/etd-07232009-133548/, etd-07232009-133548
Date Deposited: 10 Nov 2011 19:53
Last Modified: 15 Nov 2016 13:46
URI: http://d-scholarship.pitt.edu/id/eprint/8544
