Graduation Semester and Year
2019
Language
English
Document Type
Dissertation
Degree Name
Doctor of Philosophy in Computer Engineering
Department
Computer Science and Engineering
First Advisor
Fillia Makedon
Abstract
As computers become more advanced, affordable, and smaller, we use them in almost every aspect of our daily lives. Nowadays, the use of computers is not limited to accomplishing work-related tasks; instead, we use computers for education, entertainment, healthcare, and many other areas that facilitate our daily activities. From this, the field of Human-Computer Interaction (HCI) emerged. HCI is a multidisciplinary field of study that focuses on utilizing computers and technology to interact with humans, improve their quality of life, and enhance their performance. Rapid advancements in related research fields, such as robotics, artificial intelligence, and sensor technologies, have tremendously improved human-computer interaction applications and made them more personalized, adaptive, and intelligent. This research explores innovative HCI applications that use robotics, sensors, and wearable technologies to monitor and assess human cognitive and physical abilities. The systems collect, analyze, and evaluate multimodal data, which include system-specific metrics and data from non-invasive sensors. The sensors used in this research include electromyography (EMG), electroencephalography (EEG), electrocardiography (ECG), electrodermal activity (EDA), pulse oximeters, inertial measurement units (IMUs), eye-trackers, and cameras. The analysis and evaluation of these multimodal data are performed using statistical analysis and machine learning techniques. Primarily, this dissertation discusses gamified robot-assisted assessment and rehabilitation, the impact of sleep quality on cognitive performance, the importance of understanding human behavior and physiology to provide adaptive and personalized training, and the impact of cognitive workload on human physical performance. The outcome of this research is a multimodal cognitive and physical assessment platform called "9PM" (9-Peg Moves). The platform combines a simple physical task, based on the principles of a standard upper-extremity test, with other standard cognitive tests to assess users' cognitive and physical performance and to understand the correlation between users' performance and their physiological and behavioral responses.
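The abstract notes that the multimodal data are evaluated with statistical analysis and machine learning, but it does not specify a particular pipeline. The sketch below is purely illustrative and not taken from the dissertation: it assumes hypothetical window-level features (e.g., emg_rms, eda_mean, heart_rate, plus a task metric such as move time) have already been extracted and aligned, and shows how a generic scikit-learn classifier could be cross-validated on such fused features. The random forest and the feature names are assumptions, not the dissertation's actual models or variables.

```python
# Illustrative sketch only -- not the dissertation's actual analysis pipeline.
# Assumes window-level features have already been extracted from each modality
# (hypothetical: EMG RMS, EDA mean, heart rate from the pulse oximeter) and
# aligned with a task-specific metric such as move completion time.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical feature matrix: one row per task window,
# columns = [emg_rms, eda_mean, heart_rate, move_time_s]
X = rng.normal(size=(120, 4))
# Hypothetical labels: 0 = low cognitive workload, 1 = high cognitive workload
y = rng.integers(0, 2, size=120)

# Standardize the fused multimodal features, then classify.
model = make_pipeline(StandardScaler(), RandomForestClassifier(random_state=0))
scores = cross_val_score(model, X, y, cv=5)
print(f"Cross-validated accuracy: {scores.mean():.2f}")
```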
Keywords
HCI, HRI, Machine learning, Sensor technology, Assessment, Rehabilitation
Disciplines
Computer Sciences | Physical Sciences and Mathematics
License
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
Recommended Citation
Abujelala, Maher, "THINK2ACT: USING MULTIMODAL DATA TO ASSESS HUMAN COGNITIVE AND PHYSICAL PERFORMANCE" (2019). Computer Science and Engineering Dissertations. 298.
https://mavmatrix.uta.edu/cse_dissertations/298
Comments
Degree granted by The University of Texas at Arlington