Please use this identifier to cite or link to this item: http://scholarbank.nus.edu.sg/handle/10635/136271
Title: TOWARDS SCALABLE GRADIENT-BASED HYPERPARAMETER OPTIMIZATION IN DEEP NEURAL NETWORKS
Authors: FU JIE
Keywords: hyperparameter optimization, deep neural networks
Issue Date: 19-Aug-2016
Source: FU JIE (2016-08-19). TOWARDS SCALABLE GRADIENT-BASED HYPERPARAMETER OPTIMIZATION IN DEEP NEURAL NETWORKS. ScholarBank@NUS Repository.
Abstract: It is well known that the performance of large deep neural networks (DNNs) is sensitive to the settings of their hyperparameters. Hyperparameter optimization is therefore recognized as a crucial step in applying DNNs to achieve the best performance and drive industrial applications. The work described in this thesis represents the first foray into scalable gradient-based methods that optimize the elementary parameters and hyperparameters of DNNs in a unified manner.
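To illustrate the core idea behind gradient-based hyperparameter optimization referred to in the abstract, here is a minimal sketch (not taken from the thesis) using a toy quadratic training loss and validation loss: the validation loss after one SGD step is differentiated with respect to the learning rate, yielding a "hypergradient" that can itself be descended. All function names and constants below are illustrative assumptions.

```python
def hypergradient(w0, eta, a=1.0, b=1.2):
    """Toy example: differentiate a validation loss through one SGD step
    with respect to the learning rate eta (an illustrative sketch only)."""
    # One SGD step on the training loss 0.5 * (w - a)^2
    g_tr = w0 - a          # gradient of the training loss at w0
    w1 = w0 - eta * g_tr   # updated parameter after the step
    # Validation loss 0.5 * (w - b)^2 evaluated at the updated parameter
    val_loss = 0.5 * (w1 - b) ** 2
    # Chain rule through the update: dw1/deta = -g_tr, so
    # dL_val/deta = (w1 - b) * (-g_tr)  -- the hypergradient
    hg = (w1 - b) * (-g_tr)
    return val_loss, hg

# Gradient descent on the hyperparameter itself
eta = 0.1
for _ in range(50):
    loss, hg = hypergradient(w0=0.0, eta=eta)
    eta -= 0.5 * hg  # hyper-step on the learning rate
```

In this toy setting the elementary parameter update and the hyperparameter update use the same machinery (gradients), which is the unification the abstract alludes to; the thesis addresses making such hypergradients scalable for real DNNs.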
URI: http://scholarbank.nus.edu.sg/handle/10635/136271
Appears in Collections: Ph.D Theses (Open)

Files in This Item:
File: FuJ.pdf | Size: 1.52 MB | Format: Adobe PDF | Access Settings: Open

Page view(s): 80 (checked on Jan 14, 2018)
Download(s): 551 (checked on Jan 14, 2018)



Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.