A precise bare simulation approach to the minimization of some distances. I. Foundations
Abstract
In information theory, as well as in the adjacent fields of statistics, machine learning, artificial intelligence, signal processing and pattern recognition, many flexibilizations of the omnipresent Kullback-Leibler information distance (relative entropy) and of the closely related Shannon entropy have become frequently used tools. The main goal of this paper is to tackle the corresponding constrained minimization (respectively, maximization) problems by a newly developed dimension-free bare (pure) simulation method. Within our discrete setup of arbitrary dimension, almost no assumptions (such as convexity) on the set of constraints are needed, and our method is precise (i.e., it converges in the limit). As a side effect, we also derive an innovative way of constructing new useful distances/divergences. To illustrate the core of our approach, we present numerous examples. The potential for widespread applicability is indicated as well; in particular, we provide many recent references for uses of the involved distances/divergences and entropies in various research fields (which may also serve as an interdisciplinary interface).
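For orientation, the following minimal Python sketch illustrates the type of problem the paper treats: minimizing the discrete Kullback-Leibler divergence D(P||Q) over probability vectors P subject to a constraint, here a moment constraint. It uses naive rejection sampling on the probability simplex, not the paper's bare-simulation algorithm; the reference distribution q, the outcome values and the target mean are illustrative assumptions.

```python
import numpy as np

def kl_divergence(p, q):
    """Discrete Kullback-Leibler divergence D(p || q) = sum_i p_i * log(p_i / q_i),
    with the conventions 0 * log(0 / q_i) = 0 and D = +inf if p_i > 0 while q_i = 0."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    if np.any(q[mask] == 0):
        return np.inf
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

rng = np.random.default_rng(0)
d = 4                                # number of outcomes (the setup allows arbitrary dimension)
q = np.array([0.1, 0.2, 0.3, 0.4])   # reference distribution (illustrative assumption)
values = np.arange(1, d + 1)         # outcome values 1, ..., d
target_mean = 2.0                    # example constraint: E_P[X] = 2 (illustrative assumption)

best_div, best_p = np.inf, None
for _ in range(200_000):
    # Draw a candidate distribution uniformly from the probability simplex.
    p = rng.dirichlet(np.ones(d))
    # Naive rejection step: keep candidates that (approximately) satisfy the constraint.
    if abs(values @ p - target_mean) < 1e-2:
        div = kl_divergence(p, q)
        if div < best_div:
            best_div, best_p = div, p

print("approximate constrained minimizer P*:", best_p)
print("approximate minimum divergence D(P*||Q):", best_div)
```

Note that this brute-force rejection scheme deteriorates rapidly as the dimension grows, which is precisely the kind of limitation a dimension-free simulation method is meant to avoid.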
Keywords
Ali-Silvey-Morimoto type
power divergences
Kullback-Leibler information distance
relative entropy
Rényi divergences
Bhattacharyya distance
Jensen-Shannon divergence/distance
alpha-divergences
Shannon entropy
Rényi entropies
Bhattacharyya coefficient
Tsallis (cross) entropies
Cressie-Read measures
Hellinger distance
Euclidean norms
generalized maximum entropy method
importance sampling
fuzzy divergences
Domains
Statistics [math.ST]