Graduation Semester and Year

2018

Language

English

Document Type

Thesis

Degree Name

Master of Science in Computer Science

Department

Computer Science and Engineering

First Advisor

Vassilis Athitsos

Abstract

Deep neural networks have become very popular for computer vision applications in recent years. At the same time, it remains important to understand the different implementation choices that need to be made when designing a neural network and to thoroughly investigate existing and novel alternatives for those choices. One of those choices is the activation function. The widely used ReLU activation function discards all values below zero and keeps those greater than zero. Variations such as Leaky ReLU and Parametric ReLU do not discard values, so that gradients are nonzero over the entire input range. However, one or both scaling parameters are implicitly or explicitly hardcoded. We propose a new variation of ReLU, which we call the Double-Weighted Rectifier Linear Unit (DWReLU), in which both scaling parameters are trainable. In our experiments on popular benchmark datasets (MNIST and CIFAR-10), the proposed activation function leads to better accuracy most of the time, compared to other activation functions.
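The precise formulation is given in the thesis itself; as a minimal sketch, assuming DWReLU scales negative inputs by one trainable parameter and non-negative inputs by another (parameter names and initial values below are illustrative, not taken from the thesis), it might look like this in PyTorch:

```python
import torch
import torch.nn as nn

class DWReLU(nn.Module):
    """Hypothetical sketch of a Double-Weighted ReLU: both the negative-side
    and positive-side slopes are trainable, unlike Leaky ReLU (fixed negative
    slope) or Parametric ReLU (trainable negative slope, positive slope fixed
    at 1). The exact formulation used in the thesis may differ."""

    def __init__(self, neg_init=0.01, pos_init=1.0):
        super().__init__()
        # Trainable scaling parameters for the two halves of the input range.
        self.neg_slope = nn.Parameter(torch.tensor(neg_init))
        self.pos_slope = nn.Parameter(torch.tensor(pos_init))

    def forward(self, x):
        # Scale negative inputs by neg_slope and non-negative inputs by
        # pos_slope; gradients are nonzero over the entire input range.
        return torch.where(x < 0, self.neg_slope * x, self.pos_slope * x)

# Usage: drop in wherever nn.ReLU would appear.
act = DWReLU()
y = act(torch.randn(4, 8))
```

Under this reading, both slopes are updated by backpropagation alongside the network weights, which is what distinguishes the approach from Leaky ReLU and Parametric ReLU as described in the abstract.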

Keywords

Deep neural networks, Activation functions, Machine learning

Disciplines

Computer Sciences | Physical Sciences and Mathematics

Comments

Degree granted by The University of Texas at Arlington
