Graduation Semester and Year
Spring 2025
Language
English
Document Type
Thesis
Degree Name
Master of Science in Computer Science
Department
Computer Science and Engineering
First Advisor
Dr. Nicholas R. Gans
Second Advisor
Dr. Manfred Huber
Third Advisor
Dr. Chris Dale McMurrough
Abstract
This paper presents our preliminary study on enabling individuals with visual impairments to safely operate mobile robots and vehicles. To achieve this, we developed a teleoperation system with accessibility at its core. The system incorporates features that enhance usability and situational awareness, including assistive control based on artificial potential fields to prevent collisions and ensure smooth navigation. It also provides multimodal feedback through (a) haptic vibrations on the gamepad controller, which convey the proximity of nearby objects detected by the robot’s laser sensor, and (b) color-coded overlays that differentiate paths, obstacles, and people through semantic segmentation performed by a deep neural network on the robot’s camera feed. To evaluate its effectiveness, we partnered with the Austin Lighthouse to conduct experiments in which legally blind participants used the system to successfully guide the robot through a testing area with obstacles.
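
The assistive control and haptic feedback described in the abstract lend themselves to a compact illustration. The sketch below is not taken from the thesis; it assumes a planar laser scan given as (range, angle) pairs, a planar (vx, vy) operator command, and illustrative parameters (D_INFLUENCE, K_REP, MAX_RUMBLE), and shows how a classic artificial-potential-field repulsive term and a nearest-obstacle vibration mapping might be combined.

    # Minimal sketch (assumptions, not the thesis implementation): blend an
    # operator velocity command with a repulsive artificial-potential-field
    # term computed from laser ranges, and map the closest obstacle to a
    # normalized gamepad vibration intensity.
    import numpy as np

    D_INFLUENCE = 1.0   # obstacles farther than this (m) exert no repulsion (assumed)
    K_REP = 0.05        # repulsive gain (assumed)
    MAX_RUMBLE = 1.0    # maximum normalized vibration intensity

    def repulsive_velocity(ranges, angles):
        """Sum classic APF repulsive contributions over a 2-D laser scan."""
        v = np.zeros(2)
        for r, a in zip(ranges, angles):
            if 0.0 < r < D_INFLUENCE:
                # Repulsion grows as the obstacle gets closer and points away from it.
                magnitude = K_REP * (1.0 / r - 1.0 / D_INFLUENCE) / (r * r)
                v -= magnitude * np.array([np.cos(a), np.sin(a)])
        return v

    def haptic_intensity(ranges):
        """Map the nearest obstacle distance to a rumble level in [0, 1]."""
        nearest = float(np.min(ranges))
        if nearest >= D_INFLUENCE:
            return 0.0
        return MAX_RUMBLE * (1.0 - nearest / D_INFLUENCE)

    def assisted_command(operator_cmd, ranges, angles):
        """Blend the operator's (vx, vy) command with the repulsive field."""
        return np.asarray(operator_cmd) + repulsive_velocity(ranges, angles)

    # Example: a 181-beam scan with one close obstacle directly ahead.
    angles = np.linspace(-np.pi / 2, np.pi / 2, 181)
    ranges = np.full_like(angles, 3.0)
    ranges[90] = 0.4  # obstacle 0.4 m straight ahead
    print(assisted_command([0.3, 0.0], ranges, angles))  # forward command pushed back
    print(haptic_intensity(ranges))                      # stronger rumble as obstacle nears

In this sketch the repulsive term simply subtracts from the commanded velocity, so the operator retains control while the field slows or deflects motion near obstacles; the vibration cue scales linearly with the nearest detected range.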
Keywords
Robotics, visually impaired, teleoperated, virtual fixtures
Disciplines
Robotics
License
This work is licensed under a Creative Commons Attribution-NonCommercial-No Derivative Works 4.0 International License.
Recommended Citation
Thamaraiselvan, Vishwaak Chandran, "VIRTUAL FIXTURES FOR TELEOPERATED ROBOTS FOR THE VISUALLY IMPAIRED" (2025). Computer Science and Engineering Theses. 526.
https://mavmatrix.uta.edu/cse_theses/526