Please use this identifier to cite or link to this item:
https://doi.org/10.1016/S0893-6080(99)00022-2
Title: Text compression via alphabet re-representation
Authors: Long, P.M.; Natsev, A.I.; Vitter, J.S.
Keywords: Alphabet re-representation; Machine learning; Neural networks; Over-fitting; Text compression
Issue Date: 1999
Citation: Long, P.M., Natsev, A.I., Vitter, J.S. (1999). Text compression via alphabet re-representation. Neural Networks, 12(4-5), 755-765. ScholarBank@NUS Repository. https://doi.org/10.1016/S0893-6080(99)00022-2
Source Title: Neural Networks
URI: http://scholarbank.nus.edu.sg/handle/10635/39304
ISSN: 0893-6080
DOI: 10.1016/S0893-6080(99)00022-2

Abstract: This article introduces the concept of alphabet re-representation in the context of text compression. We consider re-representing the alphabet so that a character's representation reflects its properties as a predictor of future text. This enables us to use an estimator from a restricted class to map contexts to predictions of upcoming characters. We describe an algorithm that uses this idea in conjunction with neural networks. The performance of our implementation is compared with that of other compression methods, such as UNIX compress, gzip, PPMC, and an alternative neural network approach.
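The idea described in the abstract can be pictured with a small, hypothetical sketch; it is not the authors' algorithm (the article's own networks and training procedure are not reproduced here). Each character is re-represented by a learned embedding vector, and a restricted-class estimator, stood in for below by a single linear layer in PyTorch, maps a fixed-length context of such representations to a distribution over the next character. The class name ReRepresentedPredictor, the context length, and the embedding size are illustrative assumptions.

# Minimal illustrative sketch, not the published algorithm: learned character
# embeddings ("alphabet re-representation") feed a restricted predictor that
# maps a fixed-length context to next-character probabilities.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ReRepresentedPredictor(nn.Module):
    def __init__(self, alphabet_size=256, embed_dim=8, context_len=4):
        super().__init__()
        # Alphabet re-representation: each character id -> short dense vector.
        self.embed = nn.Embedding(alphabet_size, embed_dim)
        # Restricted-class estimator: one linear map from the concatenated
        # context representation to next-character logits.
        self.out = nn.Linear(embed_dim * context_len, alphabet_size)

    def forward(self, context_ids):           # context_ids: (batch, context_len)
        x = self.embed(context_ids)           # (batch, context_len, embed_dim)
        x = x.flatten(start_dim=1)            # (batch, context_len * embed_dim)
        return self.out(x)                    # next-character logits

def train_on_text(text: bytes, epochs=3, context_len=4):
    model = ReRepresentedPredictor(context_len=context_len)
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    data = torch.tensor(list(text), dtype=torch.long)
    contexts = data.unfold(0, context_len, 1)[:-1]   # sliding context windows
    targets = data[context_len:]                     # character following each window
    for _ in range(epochs):
        opt.zero_grad()
        # Cross-entropy approximates the average code length (in nats) an
        # entropy coder driven by these predictions would need per character.
        loss = F.cross_entropy(model(contexts), targets)
        loss.backward()
        opt.step()
    return model, loss.item()

if __name__ == "__main__":
    model, nats_per_char = train_on_text(b"abracadabra abracadabra abracadabra")
    print(f"cross-entropy: {nats_per_char:.3f} nats/char")

In such a scheme the predicted distribution would drive an entropy coder (for example, arithmetic coding), so a lower cross-entropy corresponds directly to fewer bits emitted per character.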
Appears in Collections: Staff Publications
Files in This Item:
There are no files associated with this item.