Please use this identifier to cite or link to this item: https://doi.org/10.3390/e13111945
Title: A characterization of entropy in terms of information loss
Authors: Baez, J.C.
Fritz, T.
Leinster, T.
Issue Date: 2011
Publisher: MDPI AG
Citation: Baez, J.C., Fritz, T., Leinster, T. (2011). A characterization of entropy in terms of information loss. Entropy 13 (11): 1945-1957. ScholarBank@NUS Repository. https://doi.org/10.3390/e13111945
Rights: Attribution 4.0 International
Abstract: There are numerous characterizations of Shannon entropy and Tsallis entropy as measures of information obeying certain properties. Using work by Faddeev and Furuichi, we derive a very simple characterization. Instead of focusing on the entropy of a probability measure on a finite set, this characterization focuses on the "information loss", or change in entropy, associated with a measure-preserving function. Information loss is a special case of conditional entropy: namely, it is the entropy of a random variable conditioned on some function of that variable. We show that Shannon entropy gives the only concept of information loss that is functorial, convex-linear and continuous. This characterization naturally generalizes to Tsallis entropy as well. © 2011 by the authors.
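The "information loss" the abstract describes can be made concrete: for a measure-preserving function f from a finite probability space to another, the loss is the Shannon entropy of the source distribution minus the entropy of its pushforward along f. A minimal sketch in Python (function names and the parity example are illustrative, not from the paper):

```python
from collections import defaultdict
from math import log2

def shannon_entropy(p):
    """Shannon entropy (in bits) of a finite distribution {outcome: probability}."""
    return -sum(q * log2(q) for q in p.values() if q > 0)

def pushforward(p, f):
    """Push a distribution on X forward along f: X -> Y."""
    q = defaultdict(float)
    for x, px in p.items():
        q[f(x)] += px
    return dict(q)

def information_loss(p, f):
    """Entropy lost when x ~ p is coarsened to f(x): H(p) - H(f_* p)."""
    return shannon_entropy(p) - shannon_entropy(pushforward(p, f))

# Example: uniform distribution on {0, 1, 2, 3}, f collapses each point to its parity.
p = {x: 0.25 for x in range(4)}
loss = information_loss(p, lambda x: x % 2)
# H(p) = 2 bits, H(f_* p) = 1 bit, so one bit of information is lost.
```

This quantity equals the conditional entropy H(X | f(X)), matching the abstract's description of information loss as the entropy of a random variable conditioned on a function of itself.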
Source Title: Entropy
URI: https://scholarbank.nus.edu.sg/handle/10635/180974
ISSN: 1099-4300
DOI: 10.3390/e13111945
Appears in Collections: Staff Publications, Elements

Files in This Item:
10_3390_e13111945.pdf (245.53 kB, Adobe PDF, Open Access)


