Graduation Semester and Year
2019
Language
English
Document Type
Dissertation
Degree Name
Doctor of Philosophy in Electrical Engineering
Department
Electrical Engineering
First Advisor
Michael T. Manry
Abstract
Training methods for both shallow and deep neural networks are dominated by first-order algorithms related to back propagation and conjugate gradient. However, these methods lack affine invariance, so their performance is degraded by nonzero input means, dependent inputs, dependent hidden units, and the use of only a single learning factor. This dissertation reviews affine invariance and shows how MLP training can be made partially affine invariant when Newton's method is used to train small numbers of MLP parameters. Several novel methods are proposed for scalable, partially affine invariant MLP training, and their potential application to deep learning is discussed. Ten-fold testing errors on several datasets show that the proposed algorithm outperforms back propagation and conjugate gradient, and that it scales far better than Levenberg-Marquardt.
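The affine invariance property at the center of the abstract can be illustrated with a minimal sketch (not the dissertation's algorithm; the quadratic objective, the transform T, and the learning factor are assumptions chosen for demonstration). It shows that one Newton step gives the same answer before and after an invertible affine reparameterization of the weights, while a gradient-descent step does not, which is why first-order methods are sensitive to input means and dependent inputs.

```python
# Illustrative sketch of affine invariance, assuming a toy quadratic
# objective f(w) = 0.5 w^T A w and an invertible reparameterization w = T v.
import numpy as np

A = np.array([[10.0, 0.0], [0.0, 1.0]])   # ill-conditioned quadratic (assumed)
T = np.array([[2.0, 1.0], [0.0, 3.0]])    # invertible affine transform (assumed)

def grad(M, x):
    # Gradient of 0.5 x^T M x is M x.
    return M @ x

def newton_step(M, x):
    # For a quadratic, the Hessian is M, so one Newton step reaches the minimum.
    return x - np.linalg.solve(M, grad(M, x))

def gd_step(M, x, lr=0.05):
    # Plain gradient descent with a single learning factor (assumed value).
    return x - lr * grad(M, x)

w0 = np.array([1.0, 1.0])
v0 = np.linalg.solve(T, w0)   # the same starting point in the new coordinates
A_t = T.T @ A @ T             # transformed quadratic: g(v) = 0.5 v^T (T^T A T) v

# Newton's method: the step commutes with the reparameterization.
print(np.allclose(newton_step(A, w0), T @ newton_step(A_t, v0)))  # True

# Gradient descent: the trajectories disagree after reparameterization.
print(np.allclose(gd_step(A, w0), T @ gd_step(A_t, v0)))          # False
```

Running the sketch prints True for Newton's method and False for gradient descent, mirroring the abstract's claim that first-order training is hurt by affine changes of the inputs while Newton-based training of selected parameters is not.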
Keywords
Back propagation, Vanishing gradient, Balanced gradient
Disciplines
Electrical and Computer Engineering | Engineering
License
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
Recommended Citation
Nguyen, Son Nam, "AFFINE INVARIANCE IN MULTILAYER PERCEPTRON TRAINING" (2019). Electrical Engineering Dissertations. 368.
https://mavmatrix.uta.edu/electricaleng_dissertations/368
Comments
Degree granted by The University of Texas at Arlington