Mathematical inequalities for some divergences
Some natural phenomena deviate from standard statistical behavior, and their study has increased interest in new definitions of information measures. However, the steps for deriving the best definition of the entropy of a given dynamical system remain unknown. In this paper, we introduce some parametric extended divergences combining the Jeffreys divergence and the Tsallis entropy defined by generalized logarithmic functions, which lead to new inequalities. In addition, we give lower bounds for one-parameter extended Fermi–Dirac and Bose–Einstein divergences. Finally, we establish some inequalities for the Tsallis entropy, the Tsallis relative entropy and some divergences by use of Young's inequality.
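The abstract's central objects are the Tsallis entropy, built from a generalized (q-deformed) logarithm, and the Jeffreys divergence. As a minimal illustrative sketch (standard textbook definitions, not code from the paper itself), these quantities can be computed for discrete distributions as follows:

```python
import math

def q_log(x, q):
    """Generalized logarithm ln_q(x) = (x^(1-q) - 1) / (1 - q); reduces to ln(x) as q -> 1."""
    if q == 1.0:
        return math.log(x)
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

def tsallis_entropy(p, q):
    """Tsallis entropy H_q(p) = (1 - sum_i p_i^q) / (q - 1), written via the q-logarithm."""
    return -sum(pi ** q * q_log(pi, q) for pi in p if pi > 0)

def jeffreys_divergence(p, r):
    """Jeffreys divergence J(p, r) = sum_i (p_i - r_i) * ln(p_i / r_i), the symmetrized KL divergence."""
    return sum((pi - ri) * math.log(pi / ri)
               for pi, ri in zip(p, r) if pi > 0 and ri > 0)
```

For example, the uniform distribution on two outcomes gives `tsallis_entropy([0.5, 0.5], 2.0) == 0.5`, and the Jeffreys divergence of any distribution from itself is zero. The paper's parametric extensions combine these two ingredients.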
Year of publication: 2012
Authors: Furuichi, Shigeru; Mitroi, Flavia-Corina
Published in: Physica A: Statistical Mechanics and its Applications. - Elsevier, ISSN 0378-4371. - Vol. 391.2012, 1, p. 388-400
Publisher: Elsevier
Subject: Mathematical inequality | Tsallis relative entropy | Jeffreys divergence | Jensen–Shannon divergence | Fermi–Dirac divergence | Bose–Einstein divergence | quasilinear divergence
Similar items by subject
- Sinha, Hariom Sharan (2018)
- Bounds for Jeffreys–Tsallis and Jensen–Shannon–Tsallis divergences / Popescu, P.G. (2014)
- Generalized relative entropies in the classical limit / Kowalski, A.M. (2015)