ORCID Identifier(s)

0000-0002-7713-5507

Graduation Semester and Year

Fall 2024

Language

English

Document Type

Dissertation

Degree Name

Doctor of Philosophy in Mechanical Engineering

Department

Mechanical and Aerospace Engineering

First Advisor

Dr. Panos S. Shiakolas

Second Advisor

Dr. Ashfaq Adnan

Third Advisor

Dr. Christopher McMurrough

Fourth Advisor

Dr. Ioannis D. Schizas

Fifth Advisor

Dr. Kamesh Subbarao

Abstract

Object interaction relies on a bi-directional communication channel that allows a person to act and adapt while forming a sensory perception from naturally available, meaningful feedback. Individuals with disabilities may rely on assistive devices adapted to their abilities, such as wheelchairs or robots receiving input through a brain-computer interface (BCI), to perform object interactions. A user might find interaction through an assistive device challenging if a bi-directional communication channel is not formed because meaningful feedback is absent. In addition, the user is not always part of the decision-making process after providing input and must rely on device-generated feedback or on visually monitoring the device. The absence of bi-directional communication can make device operation taxing, especially with non-invasive input tools such as BCI, whose signals can be misinterpreted. This research aims to develop an operational architecture for human-robot interaction (HRI) using vibrotactile feedback for bi-directional communication with an assistive robotic device. The research is organized into four focus areas: an architecture to control a robotic system, an architecture to control an anthropomorphic robotic hand, a methodology for authoring vibrotactile feedback, and a comparison of predefined and personalized feedback for HRI. An operational architecture is proposed for bi-directional communication; it classifies the action to be performed by the robot based on user inputs and verifies the action using vibrotactile feedback before execution. A methodology utilizing trigonometric functions and waveform generators is proposed to generate vibrotactile feedback. Custom hardware platforms for HRI were conceptualized, developed, and utilized to evaluate the architecture and the feedback generation methodology through participant-based evaluations. The analysis of the results showed that participants could follow the operational architecture and establish a bi-directional communication channel. The participants could successfully identify vibrotactile feedback, recall feedback-associated information, and author personalized feedback. Analysis of the personalized feedback showed that its characteristics can be categorized as opposite, sequential, or distinct. Compared with predefined feedback, personalized feedback improved feedback recognition and reduced the mental demand and effort required. These results provide confidence for further research on non-verbal communication and the application of machine learning for HRI, as well as for evaluating the system with persons with special needs.
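
Illustrative sketch (not the dissertation's implementation): the abstract describes authoring vibrotactile feedback with trigonometric functions and waveform generators. The Python snippet below shows one way such a cue could be composed, a sine carrier shaped by a slower sinusoidal envelope; all function names, frequencies, and parameter values are assumptions for illustration only.

    import numpy as np

    def vibrotactile_cue(duration_s=1.0, sample_rate_hz=8000,
                         carrier_hz=175.0, envelope_hz=2.0, amplitude=1.0):
        """Return time and amplitude arrays for a sine carrier shaped by a
        slower raised-cosine envelope, one way to author distinguishable
        vibration patterns (illustrative parameters only)."""
        t = np.linspace(0.0, duration_s,
                        int(duration_s * sample_rate_hz), endpoint=False)
        carrier = np.sin(2.0 * np.pi * carrier_hz * t)                   # tactor drive tone
        envelope = 0.5 * (1.0 - np.cos(2.0 * np.pi * envelope_hz * t))   # smooth pulsing
        return t, amplitude * envelope * carrier

    if __name__ == "__main__":
        t, signal = vibrotactile_cue()
        print(f"Generated {signal.size} samples, peak {np.max(np.abs(signal)):.2f}")

Varying the carrier frequency, envelope rate, or amplitude in a sketch like this is one plausible way to produce the distinct, sequential, or opposite feedback characteristics the abstract mentions.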

Keywords

Human-robot interaction, Brain-computer interface, Assistive robotics, Vibrotactile feedback, Grasp training, Anthropomorphic robotic hand, Human-machine interaction, User interaction, Action verification, Personalized haptic feedback

Disciplines

Cognition and Perception | Cognitive Science | Electro-Mechanical Systems

Available for download on Wednesday, December 16, 2026
