Please use this identifier to cite or link to this item: https://scholarbank.nus.edu.sg/handle/10635/186064
Title: DEEP NEURAL NETWORK APPROXIMATION VIA FUNCTION COMPOSITIONS
Authors: ZHANG SHIJUN
ORCID iD: orcid.org/0000-0003-4115-7891
Keywords: Function Composition, Deep Neural Network, Approximation Theory, Floor or ReLU Activation Function, Exponential Convergence, Polynomial Approximation
Issue Date: 4-Aug-2020
Citation: ZHANG SHIJUN (2020-08-04). DEEP NEURAL NETWORK APPROXIMATION VIA FUNCTION COMPOSITIONS. ScholarBank@NUS Repository.
Abstract: Deep neural networks have had a significant impact on many fields of computer science and engineering, especially on large-scale and high-dimensional learning problems. This thesis focuses on the approximation theory of deep neural networks. We provide nearly optimal approximation error estimates, in terms of width and depth, for ReLU networks constructed via function compositions to uniformly approximate polynomials, continuous functions, and smooth functions on a hypercube. The optimality of these error estimates is discussed by connecting the approximation property to the VC-dimension. Finally, we introduce a new class of networks in which each neuron uses either Floor (the floor function) or ReLU as its activation function; these networks achieve much smaller approximation errors than ReLU networks.
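To illustrate the function-composition idea underlying these estimates, here is a minimal sketch (illustrative only, not code from the thesis; the helper names tooth and approx_square are made up). It implements the well-known Yarotsky-type construction: a ReLU-expressible "tooth" function, composed with itself k times, yields a piecewise-linear approximant of x^2 on [0,1] whose uniform error 4^(-(k+1)) decays exponentially in the depth k.

    import numpy as np

    def relu(x):
        return np.maximum(x, 0.0)

    def tooth(x):
        # Hat function g on [0,1], written with two ReLU units:
        # g(x) = 2x on [0, 1/2] and g(x) = 2(1 - x) on [1/2, 1].
        return 2.0 * relu(x) - 4.0 * relu(x - 0.5)

    def approx_square(x, depth):
        # Approximate x**2 on [0,1] by x minus a series of composed teeth.
        # Composing tooth s times gives a sawtooth with 2**(s-1) teeth;
        # after `depth` compositions the uniform error is 4**(-(depth+1)).
        out = np.asarray(x, dtype=float)
        g = out
        for s in range(1, depth + 1):
            g = tooth(g)               # s-fold composition
            out = out - g / 4.0**s     # refine the piecewise-linear interpolant
        return out

    # Error decays exponentially with depth, matching the stated bound.
    xs = np.linspace(0.0, 1.0, 10001)
    for k in (2, 4, 8):
        err = np.max(np.abs(approx_square(xs, k) - xs**2))
        print(f"depth {k}: sup error {err:.3e} (bound {4.0**-(k + 1):.3e})")

This exponential-in-depth rate for x^2 is the standard building block for approximating products, hence polynomials, and in turn smooth functions via local Taylor expansions.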
URI: https://scholarbank.nus.edu.sg/handle/10635/186064
Appears in Collections: Ph.D Theses (Open)

Files in This Item:
  File: ZhangShijun_Thesis_2020.pdf
  Size: 1.26 MB
  Format: Adobe PDF
  Access Settings: OPEN
  Version: None
