Please use this identifier to cite or link to this item:
https://scholarbank.nus.edu.sg/handle/10635/166275
Title: REGULARIZATION ON MACHINE LEARNING
Authors: LIANG SENWEI
Keywords: deep learning, regularization, generalization, overfitting, Drop-Activation
Issue Date: 18-Dec-2019
Citation: LIANG SENWEI (2019-12-18). REGULARIZATION ON MACHINE LEARNING. ScholarBank@NUS Repository.
Abstract: Deep neural networks have become a powerful tool for machine learning problems, but they frequently overfit. To achieve better generalization, many regularization methods have been proposed to reduce overfitting. In this thesis, we propose a simple yet effective regularization method called Drop-Activation. During training, nonlinear activation functions are randomly dropped, i.e., replaced by identity functions. At test time, a deterministic network is used whose new activation function is designed to average out the randomness of dropping activations. We theoretically derive the implicit regularization terms of Drop-Activation and show that its effect can be viewed as an implicit reduction in the number of parameters. Our analysis also verifies that Drop-Activation can be used together with Batch Normalization (Ioffe and Szegedy, 2015). Experiments on benchmark datasets show that Drop-Activation generally improves the performance of popular networks.
URI: https://scholarbank.nus.edu.sg/handle/10635/166275
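The training/test scheme described in the abstract can be made concrete with a short sketch. Below is a minimal PyTorch illustration of a Drop-Activation layer built around ReLU; the class name `DropActivation`, the retain probability `p`, and the exact averaging convention are assumptions for illustration, not the thesis's reference implementation.

```python
import torch
import torch.nn as nn

class DropActivation(nn.Module):
    """Illustrative Drop-Activation layer for ReLU (assumed convention:
    each unit keeps its ReLU with probability p, else uses the identity)."""

    def __init__(self, p: float = 0.95):
        super().__init__()
        self.p = p  # probability of keeping the nonlinearity

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if self.training:
            # Training phase: a Bernoulli mask chooses ReLU (1) or
            # identity (0) independently per element.
            mask = torch.bernoulli(torch.full_like(x, self.p))
            return mask * torch.relu(x) + (1.0 - mask) * x
        # Test phase: the deterministic average of the random training
        # behaviour, which for ReLU is a leaky-ReLU with slope (1 - p)
        # on the negative side.
        return self.p * torch.relu(x) + (1.0 - self.p) * x
```

In use, such a layer would simply replace `nn.ReLU()` inside a network, e.g. `nn.Sequential(nn.Linear(128, 128), DropActivation(p=0.95))`. The test-time branch is the closed-form expectation of the training-time branch, which is what lets the deterministic network average the effect of the randomness.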
Appears in Collections: Master's Theses (Open)
Files in This Item:
File | Description | Size | Format | Access Settings | Version
---|---|---|---|---|---
LiangSW.pdf | | 771.09 kB | Adobe PDF | OPEN | None
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.