Entropies and entropic criteria - Archive ouverte HAL
Book chapter, Year: 2015

Entropies and entropic criteria

Abstract

This chapter focuses on the notions of entropy and of maximum entropy distribution, which are characterized from different perspectives. Beyond links with applications in engineering and physics, it is shown that regularization functionals can be built through a maximum entropy technique and then employed as ad hoc potentials in data inversion problems. The chapter begins with an overview of the key properties of information measures and with the introduction of various concepts and definitions. In particular, the Rényi divergence is defined, the concept of escort distribution is presented, and the principle of maximum entropy that is used subsequently is discussed. A classical engineering problem, source coding, is then presented; it illustrates the benefit of using codeword length measures different from the standard one, in particular an exponential measure, which leads to a source coding theorem whose lower bound is a Rényi entropy. It is also shown that the optimal codes can be easily computed from escort distributions. A simple state transition model is then introduced and examined. This model leads to an equilibrium distribution defined as a generalized escort distribution and, as a by-product, leads once again to a Rényi entropy. The Fisher information flow along the curve defined by the generalized escort distribution is examined, and connections with the Jeffreys divergence are established. Finally, various arguments are obtained which, in this framework, lead to an inference method based on the minimization of the Rényi entropy under a generalized mean constraint, that is to say, a mean taken with respect to the escort distribution. The last part of the chapter is concerned with the minimization of the Rényi divergence subject to a generalized mean constraint. The optimal density that solves this problem and the value of the corresponding optimal divergence are given and characterized. The main properties of the associated entropy functionals are defined and characterized. Finally, it is shown how these entropies can be computed in practice and how they can be used for solving linear problems.
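For reference, the main quantities mentioned in the abstract admit standard definitions; the LaTeX sketch below uses generic notation (densities p and q, order \alpha, code alphabet of size b, codeword lengths l_i), which is not necessarily the notation used in the chapter itself.

% Rényi entropy of order \alpha (\alpha > 0, \alpha \neq 1)
\[ H_\alpha(p) = \frac{1}{1-\alpha} \log \int p(x)^{\alpha} \,\mathrm{d}x \]
% Rényi divergence of order \alpha between p and q
\[ D_\alpha(p \,\|\, q) = \frac{1}{\alpha-1} \log \int p(x)^{\alpha} q(x)^{1-\alpha} \,\mathrm{d}x \]
% Escort distribution of order \alpha associated with p
\[ P_\alpha(x) = \frac{p(x)^{\alpha}}{\int p(u)^{\alpha} \,\mathrm{d}u} \]
% Exponential (Campbell-type) mean codeword length with parameter t > 0;
% its minimum over codes satisfying the Kraft inequality is lower bounded
% by the Rényi entropy of order \alpha = 1/(1+t)
\[ L_t = \frac{1}{t} \log_b \sum_i p_i \, b^{\, t\, l_i} \]

In this classical setting, near-optimal codeword lengths are obtained from the escort distribution, l_i \approx -\log_b P_\alpha(i), which is the sense in which optimal codes are easily computed from escort distributions.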
Main file
chap11.pdf (504.23 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-01087579, version 1 (26-11-2014)

Identifiers

  • HAL Id: hal-01087579, version 1

Cite

Jean-François Bercher. Entropies and entropic criteria. In: Jean-François Giovannelli, Jérôme Idier (eds.), Inversion methods applied to signal and image processing, Wiley, pp. 26, 2015. ⟨hal-01087579⟩
216 Views
262 Downloads
