Please use this identifier to cite or link to this item:
https://scholarbank.nus.edu.sg/handle/10635/247638
Title: PRUNING NEURAL NETWORKS USING DETERMINANTAL POINT PROCESSES
Authors: VLADIMIR PETROVIC
ORCID iD: orcid.org/0009-0003-3366-9093
Keywords: Determinantal Point Processes, Neural Networks, Theory of Machine Learning
Issue Date: 28-Jan-2024
Citation: VLADIMIR PETROVIC (2024-01-28). PRUNING NEURAL NETWORKS USING DETERMINANTAL POINT PROCESSES. ScholarBank@NUS Repository.
Abstract: Neural network pruning aims to reduce the inference cost of an already-trained neural network. Within this scope, various strategies can be devised to find a subnetwork that performs nearly as well as the original, without retraining from scratch. In this master's thesis, we tackle neural network pruning via determinantal point processes (DPPs). The goal of this work is to provide theoretical guarantees for the empirical work of Mariet and Sra from 2016. To obtain such guarantees, we first give a general account of DPPs. We then analyse neural network pruning through the lens of coresets in the most general setting. Finally, we generalize an approach from statistical mechanics to better understand the generalization error of a neural network pruned with a DPP in the noiseless teacher/student framework.
URI: https://scholarbank.nus.edu.sg/handle/10635/247638
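For intuition only, the core idea of DPP-based pruning (selecting a diverse, non-redundant subset of neurons under a similarity kernel) can be sketched with a greedy MAP approximation to the DPP objective. This is a generic illustration, not the construction analyzed in the thesis or the exact procedure of Mariet and Sra; the function `greedy_dpp_select` and the toy activation matrix below are hypothetical.

```python
import numpy as np

def greedy_dpp_select(L, k):
    """Greedy MAP approximation for a DPP with kernel L: pick k indices
    that approximately maximize det(L[S, S]), which favors diverse items."""
    n = L.shape[0]
    selected = []
    for _ in range(k):
        best_i, best_gain = None, -np.inf
        for i in range(n):
            if i in selected:
                continue
            idx = selected + [i]
            # log-determinant of the candidate principal submatrix
            sign, logdet = np.linalg.slogdet(L[np.ix_(idx, idx)])
            if sign > 0 and logdet > best_gain:
                best_gain, best_i = logdet, i
        if best_i is None:  # no remaining item keeps the submatrix nonsingular
            break
        selected.append(best_i)
    return selected

# Hypothetical example: rows are per-neuron activation vectors;
# neurons 0 and 1 are redundant (identical activations).
acts = np.array([[1.0, 0.0],
                 [1.0, 0.0],
                 [0.0, 1.0],
                 [0.5, 0.5]])
L = acts @ acts.T            # similarity kernel, PSD by construction
kept = greedy_dpp_select(L, k=2)
# kept == [0, 2]: one neuron from the redundant pair, plus a dissimilar one
```

The determinant of a kernel submatrix shrinks toward zero when the chosen neurons are near-duplicates, which is why a DPP-style criterion keeps a diverse subnetwork rather than, say, the neurons with the largest norms.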
Appears in Collections: Master's Theses (Open)
Files in This Item:
File | Description | Size | Format | Access Settings | Version
---|---|---|---|---|---
VladimirPetrovic.pdf | | 1.27 MB | Adobe PDF | OPEN | None
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.