Author

Debayan Datta

ORCID Identifier(s)

0000-0002-1188-0838

Graduation Semester and Year

2022

Language

English

Document Type

Thesis

Degree Name

Master of Science in Computer Science

Department

Computer Science and Engineering

First Advisor

Manfred Huber

Abstract

Gesture control as a replacement for conventional remote-control operation has been pursued for a long time with varying levels of success. Today, the use of gestures to control human interfaces is seen predominantly in the multimedia sector: people perform simple, intuitive gestures to control their televisions, interact with multimedia, and play games. Much research has also explored human interfaces for augmented and virtual reality devices and tasks. The results of these experiments were promising enough that researchers began extending gesture control to physical, real-world objects. One such platform is an Unmanned Aerial Vehicle (UAV) controlled during flight through hand movements and other significant gestures. While gestures offer an intuitive way to control such a platform, they also bring a number of challenges and risks that need to be addressed. The main challenges arise from the versatility and universal character of gestures, which makes them susceptible to misinterpretation of user intent, particularly in the presence of real-world distractions that can lead to the unintentional generation of gestures outside the context of the control task. The scope of hand gestures is so varied that interpreting them with a fixed framework can be risky for the UAV: a non-gesture movement, possibly produced during a natural or artificial distraction, could be classified as a control signature and cause a dramatic failure of the UAV.
For instance, if a human subject controlling the UAV with hand movements encounters an unanticipated situation that causes them to perform a similar-looking movement, and the framework misinterprets it as a known gesture, executing the unintended command could destroy the drone. The work presented here focuses on the case where the human subject is exposed to different types of distraction while controlling the UAV using hand gestures. Whenever a distraction is identified from the brain signals or hand movements, the UAV controls are suspended to avoid self-inflicted or collateral damage. Once the distraction abates, the UAV returns to a ready mode to receive new commands. The experiments were performed using a wearable Electromyography (EMG) sensor armband and a non-invasive Electroencephalography (EEG) sensor worn on the head. The armband is fitted with both EMG and Inertial Measurement Unit (IMU) sensors, which capture muscle activity and motion data, while the EEG sensor records the brain data used to identify distracted brain sequences during operation. An intuitive gesture set for drone operation was designed, and a neural network pipeline was set up to identify both the gesture performed and the current state of the brain. The pipeline consists of a Long Short-Term Memory (LSTM) network that classifies the gestures and an anomaly detector that identifies periods of operation corresponding to distractions, during which the human was likely no longer focusing on UAV operation. The LSTM classified 99.52% of the gesture set correctly, and the anomaly detector had an accuracy of 90.0006%. The precision of the distraction classifier was 91.86% and its recall was 93.89% on a dataset in which nearly 30% of the samples were distracted.
The signals were recorded from a single human subject in 3-minute intervals over 20 repetitions, with rest periods of 5-10 minutes to avoid brain and muscle fatigue.
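The suspend-on-distraction behavior described in the abstract can be illustrated with a minimal sketch. This is not the thesis implementation: the function names are hypothetical, and a simple per-channel z-score rule stands in for the thesis's neural-network anomaly detector, while a placeholder callable stands in for the LSTM gesture classifier.

```python
import numpy as np

def fit_baseline(windows):
    """Estimate per-channel mean and std from sensor windows recorded
    while the subject was focused. windows: (n_windows, n_channels)."""
    mu = windows.mean(axis=0)
    sigma = windows.std(axis=0) + 1e-8  # avoid division by zero
    return mu, sigma

def is_distracted(window, mu, sigma, threshold=3.0):
    """Flag a window as distracted if any channel's |z-score|
    exceeds the threshold relative to the focused baseline."""
    z = np.abs((window - mu) / sigma)
    return bool((z > threshold).any())

def gate_command(window, mu, sigma, classify):
    """Suspend UAV commands while a distraction is detected;
    otherwise pass the window to the gesture classifier."""
    if is_distracted(window, mu, sigma):
        return "SUSPEND"
    return classify(window)

# Usage sketch: baseline from simulated focused-state windows,
# then gate a normal window and an out-of-distribution one.
rng = np.random.default_rng(0)
baseline = rng.normal(0.0, 1.0, (200, 4))
mu, sigma = fit_baseline(baseline)
print(gate_command(mu, mu, sigma, lambda w: "TAKEOFF"))          # focused
print(gate_command(mu + 10 * sigma, mu, sigma, lambda w: "TAKEOFF"))  # distracted
```

In the thesis the windows would carry EEG/EMG/IMU features and the classifier would be the trained LSTM; here both are stand-ins chosen only to show the gating logic.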

Keywords

Robotics, Bio-sensing, Human-activity data, Artificial intelligence, Control system, Gesture recognition, Distraction detection, Intention recognition, Neural network, LSTM, Anomaly detector

Disciplines

Computer Sciences | Physical Sciences and Mathematics

Comments

Degree granted by The University of Texas at Arlington
