|Title:||Vacillatory and BC learning on noisy data|
|Citation:||Case, J., Jain, S., Stephan, F. (2000). Vacillatory and BC learning on noisy data. Theoretical Computer Science 241 (1-2): 115-141. ScholarBank@NUS Repository.|
|Abstract:||The present work employs a model of noise introduced earlier by the third author. In this model, the noisy data nonetheless uniquely determines the true data: correct information occurs infinitely often, while incorrect information occurs only finitely often. The present paper considers the effects of this form of noise on vacillatory and behaviorally correct learning of grammars, both from positive data alone and from informant (positive and negative data). For learning from informant, the noise in effect destroys the negative data. Various noisy-data hierarchies are exhibited which, in some cases, are known to collapse when there is no noise. Noisy behaviorally correct learning is shown to obey a very strong "subset principle". It is shown, in many cases, how much power is needed to overcome the effects of noise. For example, the best simulation, in the presence of noise, of the noise-free, no-mind-change case requires infinitely many mind changes. One technical result is proved by a priority argument. © 2000 Elsevier Science B.V. All rights reserved.|
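The noise model in the abstract can be illustrated with a small sketch (a hypothetical illustration, not code from the paper): correct data occurs infinitely often, incorrect data only finitely often, so the true data is exactly the set of items occurring infinitely often. The `noise_cutoff` parameter and the round-robin scheduling below are assumptions made for the demonstration; an infinite stream is approximated by a long finite prefix.

```python
import itertools

def noisy_text(true_data, noise, noise_cutoff=20):
    """Yield an infinite stream in which each incorrect item (from `noise`)
    appears only during the first `noise_cutoff` rounds, while every
    correct item (from `true_data`) keeps recurring forever."""
    for i in itertools.count():
        if i < noise_cutoff and noise:
            yield noise[i % len(noise)]   # finitely many incorrect items
        yield true_data[i % len(true_data)]  # correct items, repeated forever

def infinitely_often(prefix, cutoff):
    """Items still occurring after position `cutoff` of a finite prefix --
    a finite stand-in for 'occurs infinitely often'."""
    return set(prefix[cutoff:])

stream = noisy_text([1, 2, 3], noise=[9, 8], noise_cutoff=20)
prefix = list(itertools.islice(stream, 200))
# Once the noise has died out, only the true data keeps appearing,
# so it is uniquely recoverable from the stream.
print(infinitely_often(prefix, 50))  # -> {1, 2, 3}
```

This mirrors why, as the abstract notes, such noise "uniquely determines the true data" even though early segments of the stream may be arbitrarily misleading.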
|Source Title:||Theoretical Computer Science|
|Appears in Collections:||Staff Publications|