Satpute, Shantanu
(2019)
FingerSight: A Vibrotactile Wearable Ring to Help the Blind Locate and Reach Objects in Peripersonal Space.
Master's Thesis, University of Pittsburgh.
(Unpublished)
Abstract
Visually impaired people need a way to compensate for the lack of visual information. Although assistive technologies exist to help them navigate through the environment, blind people still rely on groping to locate and reach objects in peripersonal space. FingerSight addresses this problem using visual-to-tactile substitution. Our prototype consists of four haptic tactors embedded in a ring worn on the index finger, with a tiny camera mounted on top. The camera image is processed using computer vision to control haptic feedback to the user. The four tactors are evenly spaced around the finger. The device guides the user's hand toward the target by vibrating the tactor on the side corresponding to the target's direction; when the hand reaches the target, all four tactors vibrate simultaneously.
Two experiments were conducted on normally sighted participants to test the functionality of our prototype. The first revealed that participants could discriminate between the five different haptic stimulations with a mean accuracy of 89.4%, which improved with additional training. In the second experiment, participants were blindfolded and instructed to move the hand wearing the device to reach one of four light-emitting diodes (LEDs) mounted on a cardboard sheet within arm's reach. Infrared markers mounted on the device enabled its location to be recorded by an optical tracker. A computer vision algorithm located the LED in the camera image and controlled the tactors using two different strategies: (1) Worst Axis First and (2) Adjacent Tactor Pair. Results revealed that participants could follow the haptic instructions to reach the target with similar accuracy under both strategies, but that the time to reach the target differed significantly between them.
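The two tactor-activation strategies named above can be sketched as follows. This is an illustrative reconstruction, not the thesis implementation: the tactor indexing, pixel tolerance, and the exact decision rules are assumptions inferred from the strategy names and the abstract's description (single tactor per direction; all four vibrate when the target is reached).

```python
# Hypothetical tactor layout (an assumption, not from the thesis):
# indices 0-3 placed up, right, down, left around the finger.
UP, RIGHT, DOWN, LEFT = 0, 1, 2, 3

def worst_axis_first(dx, dy, tol=5.0):
    """Worst Axis First (sketch): vibrate the single tactor along the
    image axis with the larger pixel error.

    (dx, dy) is the target's offset from the image center; within `tol`
    pixels of center, all four tactors fire as the "reached" cue.
    """
    if abs(dx) <= tol and abs(dy) <= tol:
        return {UP, RIGHT, DOWN, LEFT}        # target reached: all vibrate
    if abs(dx) >= abs(dy):                    # horizontal error dominates
        return {RIGHT} if dx > 0 else {LEFT}
    return {DOWN} if dy > 0 else {UP}         # image y grows downward

def adjacent_tactor_pair(dx, dy, tol=5.0):
    """Adjacent Tactor Pair (sketch): vibrate one tactor, or the pair of
    adjacent tactors spanning a diagonal error direction."""
    if abs(dx) <= tol and abs(dy) <= tol:
        return {UP, RIGHT, DOWN, LEFT}
    active = set()
    if abs(dx) > tol:
        active.add(RIGHT if dx > 0 else LEFT)
    if abs(dy) > tol:
        active.add(DOWN if dy > 0 else UP)
    return active
```

For example, an offset of (50, 50) pixels yields a single tactor (the worst axis) under the first strategy but the right-down pair under the second, which is why the two strategies can cue different hand trajectories toward the same target.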
Using control systems analysis, a closed-loop proportional-integral-derivative (PID) controller and plant were simulated. A model for the plant was computed from the experimental data using an autoregressive-moving-average model with exogenous terms (ARMAX), with the human subject acting as the plant. The control system was then optimized to find the best strategy for tactor activation, laying the groundwork for a future generation of FingerSight.
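The closed-loop structure described above can be sketched as a discrete PID controller driving a plant model. In the thesis the plant is identified from experimental data via ARMAX; here a simple first-order lag stands in for that fitted model purely for illustration, and all gains, the time step, and the plant dynamics are assumptions, not values from the work.

```python
def simulate_pid(kp, ki, kd, setpoint=1.0, steps=600, dt=0.05):
    """Closed-loop simulation sketch: discrete PID plus a stand-in plant.

    The plant here is an assumed first-order lag (dy/dt = u - y) acting
    as a placeholder for the ARMAX-identified human response; `y` can be
    read as normalized hand position relative to the target.
    """
    y = 0.0                      # plant output (hand position)
    integral = 0.0
    prev_error = setpoint - y
    history = []
    for _ in range(steps):
        error = setpoint - y
        integral += error * dt
        derivative = (error - prev_error) / dt
        # PID control signal (stand-in for the tactor command)
        u = kp * error + ki * integral + kd * derivative
        prev_error = error
        y += dt * (u - y)        # forward-Euler step of the lag plant
        history.append(y)
    return history

# Illustrative gains only; the thesis optimizes these against the fitted plant.
trace = simulate_pid(kp=2.0, ki=0.5, kd=0.1)
```

Running the loop drives the output toward the setpoint with zero steady-state error thanks to the integral term; optimizing (kp, ki, kd) against the identified plant is what "find the best strategy for tactor activation" refers to in the abstract.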
Details
Item Type: University of Pittsburgh ETD
Status: Unpublished
Creators/Authors: Satpute, Shantanu
Date: 18 June 2019
Date Type: Publication
Defense Date: 29 March 2019
Approval Date: 18 June 2019
Submission Date: 2 April 2019
Access Restriction: No restriction; release the ETD for access worldwide immediately.
Number of Pages: 85
Institution: University of Pittsburgh
Schools and Programs: Swanson School of Engineering > Bioengineering
Degree: MS - Master of Science
Thesis Type: Master's Thesis
Refereed: Yes
Uncontrolled Keywords: Haptics, Perception
Date Deposited: 18 Jun 2019 15:40
Last Modified: 18 Jun 2019 15:40
URI: http://d-scholarship.pitt.edu/id/eprint/36227