Graduation Semester and Year

2021

Language

English

Document Type

Dissertation

Degree Name

Doctor of Philosophy in Computer Science

Department

Computer Science and Engineering

First Advisor

Fillia Makedon

Abstract

Human-Machine Interaction (HMI) can be defined as the way we communicate with machines through user interfaces. User interfaces have evolved from the complicated punch cards and levers of early computers to more natural forms of interaction, such as speech and gestures, in today's digital assistants. Technological advancements in computing devices have paved the way for smart, powerful computers to become part of our everyday lives. There is also an increasing trend of using smart computing devices and robots in manufacturing lines, medical procedures, rehabilitation, and personal care. The umbrella of HMI typically covers areas such as Human-Robot Interaction (HRI) and Human-Computer Interaction (HCI), but a new paradigm of Human-Robot Collaboration (HRC) is required to cover the growing research on collaborative robots. Collaborative robots, or cobots, are used in settings where humans and robots work together as a team to achieve a common goal. Such a setup requires the robot system to understand several aspects of the human partner's behavior, including their physical and mental states, depending on the area of application. Advancements in wearable sensors, artificial intelligence, and robotics have made these collaborative systems smart, personalizable, and safe. Despite the abundance of research in this field, comparatively little work examines human factors, such as human behavior and cognition, in order to create better HRC systems. The central focus of this work is to advance research on human factors for HRC. It revolves around two axes: the first explores different cognitive and behavioral assessment systems, and the second exploits the domain expertise gained to build a cognitive assessment system that simulates a real-world task. Several intelligent cognitive assessment systems are built, each capable of using physiological data to effectively predict a specific cognitive ability. Electroencephalography (EEG), electrocardiography (ECG), and electrodermal activity (EDA) sensors, as well as RGB cameras, are used to assess the user's state. Subsequently, physiological sensors are used in an industrial collaborative assembly scenario to predict user performance and cognitive load in order to enhance HRC. A collaborative system is built using advanced HMI concepts to simulate a real-world scenario and collect data from human subjects. The collected data, including system-specific performance metrics and multimodal sensor signals, are used to perform a data-driven evaluation of the developed HRC system for cognitive load prediction.
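To make the prediction step described above concrete, the following is a minimal, purely illustrative Python sketch of training a classifier on windowed multimodal physiological features to predict a binary cognitive-load label. The synthetic data, feature dimensionality, window count, and model choice are assumptions for illustration only and do not reflect the dissertation's actual pipeline.

# Minimal sketch (illustrative only): predict a high/low cognitive-load label
# from per-window physiological features. Real features would be derived from
# EEG band powers, ECG/heart-rate variability, and EDA responses; here they are
# replaced by synthetic stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Each row is one time window of extracted features; the label marks high (1)
# or low (0) cognitive load. Sizes are arbitrary for this example.
n_windows, n_features = 200, 12
X = rng.normal(size=(n_windows, n_features))
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=n_windows) > 0).astype(int)

# Standardize features, fit a random-forest classifier, and report
# cross-validated accuracy as a simple data-driven evaluation.
model = make_pipeline(StandardScaler(),
                      RandomForestClassifier(n_estimators=200, random_state=0))
scores = cross_val_score(model, X, y, cv=5)
print(f"mean cross-validated accuracy: {scores.mean():.2f}")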

Keywords

Human-robot collaboration, Machine learning, Human factors, Physiological sensors

Disciplines

Computer Sciences | Physical Sciences and Mathematics

Comments

Degree granted by The University of Texas at Arlington
