Author

Oguz Yetkin

ORCID Identifier(s)

0000-0001-8602-2339

Graduation Semester and Year

2016

Language

English

Document Type

Dissertation

Degree Name

Doctor of Philosophy in Biomedical Engineering

Department

Bioengineering

First Advisor

Dan Popa

Second Advisor

Georgios Alexandrakis

Third Advisor

Vassilis Athitsos

Fourth Advisor

Young-Tae Kim

Abstract

Modern robotic prosthetic devices for upper-limb amputees promise to alleviate an important disability, but they are underutilized because users cannot properly control them. Specifically, the devices afford more degrees of freedom (DOFs) than can be controlled by easily decoded biological signals. Devices such as the DEKA arm can have as many as 18 DOFs, although six is a more typical number (control of each finger plus thumb rotation). Unfortunately, with commercially deployed technology, users remain limited in their ability to control more than one degree of freedom at a time. Control of robotic prosthetic devices is typically achieved through electromyogram (EMG) signals read from the residual limb. While several groups have reported using multiple EMG sensors to classify user intent from residual muscle activity, such systems have not proven robust enough to translate to clinical use and are not intuitive. In the first part of this research, the prosthetic control problem is re-framed as a human-robot interface problem: several robotic interface methods are developed and clinically evaluated that can eliminate or complement the use of EMG signals while allowing the user to quickly achieve more grasping patterns, and thus to use all the DOFs available in the prosthetic device. Three healthy-limb-based methods were developed and evaluated: 1) using the healthy hand to teleoperate the prosthetic device via a Mirroring Glove, 2) using the healthy hand to issue pre-programmed commands to the prosthetic device via a Gesture Glove, and 3) using the healthy hand with extremely light fingernail-worn devices to issue commands to the prosthetic device. In the second part of this research, a field-deployable, easy way of training a multiple-input EMG classifier is presented and extended to Force Myography (FMG) data fused with EMG data.
Overall, a number of different experiments were conducted with a total of 20 human subjects, including two amputees, and the following conclusions were reached: 1) healthy-limb-based prosthetic device control can match the performance speed of EMG-based control with very little training; 2) gesture-based control using the healthy limb is faster than mirrored teleoperation, except for tasks that are mirrored by their nature; and 3) bilateral hand movements combined with kinematic tracking of the healthy limb can be used to train both a Force Myography (FMG) based classifier and an EMG-based classifier, and the combination of the two modalities holds promise to make a readily deployable multi-DOF EMG/FMG classifier system a reality.
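The fusion approach summarized above can be illustrated with a minimal sketch. This is not the dissertation's implementation: channel counts, class labels, and the nearest-centroid classifier are all assumptions chosen for brevity, and the sensor data is synthetic. It shows only the general idea of feature-level fusion, in which EMG and FMG channel vectors are concatenated before classification, with grasp labels supplied automatically (here, by construction; in the described approach, by kinematic tracking of the healthy hand during bilateral movements).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 8 EMG channels + 8 FMG channels, 3 grasp classes,
# 60 trials per class. All values are illustrative assumptions.
n_emg, n_fmg, n_classes, n_per = 8, 8, 3, 60

def synth_trials(cls):
    # Give each grasp class a distinct mean activation pattern (synthetic).
    emg = rng.normal(loc=cls * 1.0, scale=0.5, size=(n_per, n_emg))
    fmg = rng.normal(loc=cls * 0.8, scale=0.5, size=(n_per, n_fmg))
    # Feature-level fusion: concatenate EMG and FMG channels per trial.
    return np.hstack([emg, fmg])

X = np.vstack([synth_trials(c) for c in range(n_classes)])
y = np.repeat(np.arange(n_classes), n_per)

# Simple train/test split.
idx = rng.permutation(len(y))
split = int(0.8 * len(y))
tr, te = idx[:split], idx[split:]

# Minimal classifier: nearest class centroid in the fused feature space.
centroids = np.stack([X[tr][y[tr] == c].mean(axis=0) for c in range(n_classes)])

def predict(samples):
    # Distance from each sample to each class centroid; pick the closest.
    d = np.linalg.norm(samples[:, None, :] - centroids[None, :, :], axis=2)
    return d.argmin(axis=1)

acc = (predict(X[te]) == y[te]).mean()
print(f"fused EMG+FMG classification accuracy: {acc:.2f}")
```

In practice a neural network (as named in the keywords) would replace the nearest-centroid step, but the fusion pattern, concatenating the two modalities' feature vectors before training a single classifier, is the same.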

Keywords

Prosthetic device control, Fingernail based gesture tracker, Gesture tracking, Prosthetics, Electromyography, Force myography, EMG, FMG, Multi degree of freedom prosthetic device, Neural networks

Disciplines

Biomedical Engineering and Bioengineering | Engineering

Comments

Degree granted by The University of Texas at Arlington
