On some interrelations of generalized q-entropies and a generalized Fisher information, including a Cramér-Rao inequality
Abstract
In this communication, we describe some interrelations between generalized q-entropies and a generalized version of Fisher information. In information theory, the de Bruijn identity links the Fisher information and the derivative of the entropy. We show that this identity can be extended to generalized versions of the entropy and of the Fisher information. More precisely, a generalized Fisher information naturally appears in the expression of the derivative of the Tsallis entropy. This generalized Fisher information also arises as a special case of a generalized Fisher information defined for estimation problems. Indeed, we derive here a new Cramér-Rao inequality for the estimation of a parameter, which involves a generalized form of Fisher information that includes the standard Fisher information as a particular case. In the case of a translation parameter, this general Cramér-Rao inequality yields an inequality for distributions that is saturated by the generalized q-Gaussian distributions. These generalized q-Gaussians are important in several areas of physics and mathematics; they are known to maximize the q-entropies subject to a moment constraint. The Cramér-Rao inequality shows that the generalized q-Gaussians also minimize the generalized Fisher information among distributions with a fixed moment, and, similarly, among distributions with a given q-entropy.
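For the reader's convenience, a brief reminder of the classical objects that the abstract generalizes is sketched below in LaTeX. The definitions and normalizations chosen here (Tsallis entropy, de Bruijn identity, location-parameter Cramér-Rao bound, q-Gaussian family) follow common textbook conventions and are assumptions on our part; they are not the specific generalized forms introduced in the paper.

% Tsallis q-entropy of a density f; the Shannon entropy is recovered as q -> 1
\[ S_q[f] = \frac{1}{q-1}\left(1 - \int f(x)^q\,\mathrm{d}x\right). \]

% Classical de Bruijn identity: for X_t = X + \sqrt{t}\,Z, with Z standard Gaussian
% and independent of X, the Shannon entropy h and Fisher information I satisfy
\[ \frac{\mathrm{d}}{\mathrm{d}t}\, h(X_t) = \frac{1}{2}\, I(X_t). \]

% Classical Cramér-Rao inequality for a scalar location parameter,
% saturated by the Gaussian distribution
\[ \operatorname{Var}(X)\, I(X) \ge 1. \]

% One common convention for the one-dimensional q-Gaussian density, which
% maximizes S_q subject to a second-moment constraint (beta > 0)
\[ G_q(x) \propto \left(1 - (1-q)\,\beta x^2\right)_{+}^{\frac{1}{1-q}}. \]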