Please use this identifier to cite or link to this item: https://doi.org/10.1006/jcss.2000.1741
Title: Improved bounds on the sample complexity of learning
Authors: Li, Y. 
Long, P.M. 
Srinivasan, A.
Keywords: Agnostic learning
Empirical process theory
Machine learning
PAC learning
Sample complexity
Issue Date: 2001
Citation: Li, Y., Long, P.M., Srinivasan, A. (2001). Improved bounds on the sample complexity of learning. Journal of Computer and System Sciences 62 (3) : 516-527. ScholarBank@NUS Repository. https://doi.org/10.1006/jcss.2000.1741
Abstract: We present a new general upper bound on the number of examples required to estimate all of the expectations of a set of random variables uniformly well. The quality of the estimates is measured using a variant of the relative error proposed by Haussler and Pollard. We also show that our bound is within a constant factor of the best possible. Our upper bound implies improved bounds on the sample complexity of learning according to Haussler's decision theoretic model.
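To make the abstract's setting concrete, here is a minimal sketch of the estimation problem it describes: estimating every expectation in a small family of bounded random variables from one shared sample, and measuring the worst-case error with a relative-error-style distance. The specific distance d_nu(r, s) = |r - s| / (nu + r + s), the example family of functions, and the parameter values below are assumptions for illustration, not taken from the paper; the paper bounds how large the sample size m must be so that the worst-case error is small with high probability.

```python
import numpy as np

def d_nu(r, s, nu):
    """Relative-error-style distance |r - s| / (nu + r + s).
    Assumed form of the Haussler/Pollard-style metric; see the paper for the exact variant."""
    return abs(r - s) / (nu + r + s)

# A small illustrative family of [0, 1]-valued random variables, written as
# functions of a common underlying draw X ~ Uniform[0, 1].
family = {
    "f1": lambda x: x,
    "f2": lambda x: x ** 2,
    "f3": lambda x: (x > 0.5).astype(float),
}
true_means = {"f1": 0.5, "f2": 1.0 / 3.0, "f3": 0.5}  # exact expectations

rng = np.random.default_rng(0)
m = 20_000   # sample size; the paper's result bounds how large m must be
nu = 0.1     # scale parameter of the distance (illustrative value)
x = rng.uniform(size=m)

# One sample of size m yields empirical means for every member of the family;
# "uniformly well" means the worst d_nu error over the family is small.
worst = max(
    d_nu(f(x).mean(), true_means[name], nu) for name, f in family.items()
)
print(f"worst-case d_nu error over the family: {worst:.4f}")
```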
Source Title: Journal of Computer and System Sciences
URI: http://scholarbank.nus.edu.sg/handle/10635/43067
ISSN: 0022-0000
DOI: 10.1006/jcss.2000.1741
Appears in Collections: Staff Publications



