Title: Incremental learning with temporary memory
Authors: Jain, S.; Lange, S.; Moelius III, S.E.; Zilles, S.
Citation: Jain, S., Lange, S., Moelius III, S.E., Zilles, S. (2010). Incremental learning with temporary memory. Theoretical Computer Science 411 (29-30): 2757-2772. ScholarBank@NUS Repository. https://doi.org/10.1016/j.tcs.2010.04.010
Abstract: In the inductive inference framework of learning in the limit, a variation of the bounded example memory (Bem) language learning model is considered. Intuitively, the new model constrains the learner's memory not only in how much data may be stored, but also in how long those data may be stored without being refreshed. More specifically, the model requires that, if the learner commits an example x to memory, and x is not presented to the learner again thereafter, then eventually the learner forgets x, i.e., eventually x no longer appears in the learner's memory. This model is called temporary example memory (Tem) learning. Many interesting results concerning the Tem-learning model are presented. For example, there exists a class of languages that can be identified by memorizing k + 1 examples in the Tem sense, but that cannot be identified by memorizing k examples in the Bem sense. On the other hand, there exists a class of languages that can be identified by memorizing just one example in the Bem sense, but that cannot be identified by memorizing any number of examples in the Tem sense. Results are also presented concerning the special case of learning classes of infinite languages. © 2010 Elsevier B.V. All rights reserved.
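The Tem constraint described in the abstract can be illustrated informally: the learner may hold at most a bounded number of examples, and any stored example that is not presented again is eventually dropped. The sketch below is a simple Python illustration under assumed simplifications (a fixed capacity and a fixed time-to-live after which an unrefreshed example is forgotten); it is not the paper's formal model, in which "eventually" carries no fixed deadline.

```python
# Illustrative sketch only: a bounded example memory in which each stored
# example must be refreshed by reappearing in the input stream, or it is
# forgotten after a fixed number of steps. The `capacity` bound mirrors the
# Bem-style limit on how many examples may be stored; the `ttl` parameter is
# an assumed simplification of the Tem requirement that unrefreshed examples
# are eventually forgotten.
class TemporaryExampleMemory:
    def __init__(self, capacity, ttl):
        self.capacity = capacity  # at most this many examples stored at once
        self.ttl = ttl            # steps an example survives without refresh
        self._store = {}          # example -> remaining lifetime

    def observe(self, x):
        """Process the next example from the presented text."""
        # Age every stored example by one step; forget any whose time ran out.
        aged = {e: t - 1 for e, t in self._store.items()}
        self._store = {e: t for e, t in aged.items() if t > 0}
        if x in self._store:
            # x was presented again: refresh its lifetime.
            self._store[x] = self.ttl
        elif len(self._store) < self.capacity:
            # Commit x to memory if there is room.
            self._store[x] = self.ttl

    def memory(self):
        return set(self._store)
```

With `capacity=2` and `ttl=3`, an example such as 1 that stops reappearing in the stream is eventually forgotten, freeing its slot for later examples; this is the behavior the Tem model requires of every committed example that is never presented again.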
Source Title: Theoretical Computer Science
Appears in Collections: Staff Publications
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.