Author

Tien Pham

ORCID Identifier(s)

0000-0001-5632-9209

Graduation Semester and Year

2023

Language

English

Document Type

Thesis

Degree Name

Master of Science in Computer Science

Department

Computer Science and Engineering

First Advisor

Manfred Huber

Abstract

Human-Computer Interaction (HCI) is an essential aspect of modern technology that has revolutionized the way we interact with machines. With the proliferation of computers and smart devices and the advent of autonomous vehicles and other machines, significant advances in this area have made it possible for users to interact with technology intuitively and efficiently. However, the importance of HCI goes beyond the convenience of everyday technology. It has become crucial in the development of assistive technologies that empower people with disabilities to live more independently. Persons with disabilities, who lack control of one or more parts of their body or who have mental limitations, must interact with machines in special, often highly customized ways that match their individual capabilities. One common machine that many people with severe physical disabilities interact with every day is the wheelchair, which has been used for decades to facilitate their lives. While the simple interfaces commonly available on wheelchairs, such as joysticks or sip-and-puff interfaces, are often sufficient, they are difficult to use for persons with severe disabilities who lack proper control of their hands, or are inconvenient and hard to utilize. This need, as well as the quest for more intuitive, lower-overhead control, motivates research into other ways to interact with a wheelchair in the context of partially autonomous navigation. In this thesis, a context-aware gaze-based interface is developed that allows users to control the wheelchair naturally, without having to translate their eye gaze into specific commands. The system estimates eye gaze directions and analyzes the locations users are looking at to obtain the context needed to infer navigation intention.
A navigation detection model is also embedded into the system that distinguishes among users' navigation intention, navigation-related attention, and non-navigation attention, serving as a driver for semi-autonomous smart wheelchair systems.

Keywords

HCI, Smart wheelchair, Gaze estimation, Machine learning

Disciplines

Computer Sciences | Physical Sciences and Mathematics

Comments

Degree granted by The University of Texas at Arlington
