|Title:||Program size restrictions in computational learning|
|Authors:||Jain, S.; Sharma, A.|
|Citation:||Jain, S., Sharma, A. (1994-05-23). Program size restrictions in computational learning. Theoretical Computer Science 127 (2) : 351-386. ScholarBank@NUS Repository.|
|Abstract:||A model for a subject S learning its environment E could be described thus: S, placed in E, receives data about E, and simultaneously conjectures a sequence of hypotheses. S is said to learn E just in case the sequence of hypotheses conjectured by S stabilizes to a final hypothesis which correctly represents E. Computational learning theory provides a framework for studying problems of this nature when the subject is a machine. A natural abstraction for the notion of hypothesis is a computer program. The present paper, in the above framework of learning, presents arguments for the final hypothesis to be succinct, and introduces a plethora of formulations of such succinctness. A revelation of this study is that some of the "natural" notions of succinctness may be uninteresting because learning capability of machines under these seemingly natural constraints is dependent on the choice of programming system used to interpret hypotheses. © 1994.|
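The identification-in-the-limit model described in the abstract, together with the paper's theme of succinct final hypotheses, can be sketched in a few lines. The sketch below is illustrative only and is not taken from the paper: the hypothesis space (linear functions with integer coefficients), the "size" measure (sum of absolute coefficients, standing in for program size), and the target are all assumptions chosen for the example. The learner receives data about the environment one datum at a time and always conjectures a smallest hypothesis consistent with everything seen so far; on this stream its conjectures stabilize to a final, minimal-size hypothesis.

```python
from itertools import count

def hypotheses():
    # Enumerate hypotheses (a, b), representing f(x) = a*x + b, in order of
    # a crude "size" measure |a| + |b| -- a stand-in for program size, so the
    # first consistent hypothesis found is also a smallest one.
    for size in count(0):
        for a in range(-size, size + 1):
            b_abs = size - abs(a)
            for b in {b_abs, -b_abs}:  # set() avoids emitting (a, 0) twice
                yield (a, b)

def first_consistent(data):
    # Smallest hypothesis agreeing with every datum seen so far.
    for a, b in hypotheses():
        if all(a * x + b == y for x, y in data):
            return (a, b)

def learn(stream):
    # Identification in the limit: after each new datum, output a conjecture.
    # Learning succeeds if the sequence of conjectures stabilizes to a
    # hypothesis that correctly represents the environment.
    seen, conjectures = [], []
    for x, y in stream:
        seen.append((x, y))
        conjectures.append(first_consistent(seen))
    return conjectures

# Environment: data drawn from the (hypothetical) target f(x) = 2x + 1.
print(learn([(0, 1), (1, 3), (2, 5)]))  # → [(0, 1), (2, 1), (2, 1)]
```

After the first datum the learner over-commits to the smaller hypothesis (0, 1); the second datum refutes it, and the conjectures stabilize to (2, 1), a minimal-size correct hypothesis. The paper's observation that some succinctness notions depend on the chosen programming system corresponds here to the fact that a different hypothesis encoding would induce a different "size" ordering.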
|Source Title:||Theoretical Computer Science|
|Appears in Collections:||Staff Publications|