Journal article in IEEE Transactions on Information Theory, 2021

Finite Sample Improvement of Akaike's Information Criterion

Abstract

Considering the selection of frequency histograms, we propose a modification of Akaike's Information Criterion that avoids overfitting, even when the sample size is small. We call this correction an over-penalization procedure. We emphasize that the principle of unbiased risk estimation for model selection can indeed be improved by addressing excess risk deviations in the design of the penalization procedure. On the theoretical side, we prove sharp oracle inequalities for the Kullback-Leibler divergence. These inequalities are valid with positive probability for any sample size and cover the estimation of unbounded log-densities. In the course of the proofs, we derive several analytical lemmas related to the Kullback-Leibler divergence, as well as concentration inequalities, that are of independent interest. In a simulation study, we also demonstrate state-of-the-art performance of our over-penalization criterion for bin size selection, in particular outperforming the AICc procedure.
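For intuition, here is a minimal Python sketch of penalized maximum-likelihood bin selection for a regular histogram on [0, 1]. With c = 0 it reduces to plain AIC; the extra term for c > 0 merely illustrates the over-penalization principle. The sqrt form of that term, as well as the names histogram_log_likelihood, select_bin_count, and the constant c, are illustrative assumptions for this sketch, not the criterion constructed in the paper.

```python
import numpy as np

def histogram_log_likelihood(x, D):
    # Maximized log-likelihood of a regular D-bin histogram density on [0, 1]:
    # the density estimate on bin j is D * N_j / n, so the log-likelihood is
    # sum_j N_j * log(D * N_j / n), with empty bins contributing zero.
    n = len(x)
    counts, _ = np.histogram(x, bins=D, range=(0.0, 1.0))
    nz = counts[counts > 0]
    return float(np.sum(nz * np.log(D * nz / n)))

def select_bin_count(x, D_max, c=0.0):
    # Select the bin count D maximizing log-likelihood minus penalty.
    # c = 0 recovers plain AIC (penalty = D - 1, the number of free bin
    # probabilities); c > 0 adds an over-penalization term. This sqrt form
    # is a hypothetical placeholder, not the paper's criterion.
    n = len(x)
    best_D, best_crit = 1, -np.inf
    for D in range(1, D_max + 1):
        pen = (D - 1) + c * np.sqrt((D - 1) * np.log(n))
        crit = histogram_log_likelihood(x, D) - pen
        if crit > best_crit:
            best_D, best_crit = D, crit
    return best_D

# Small-sample illustration with data supported on [0, 1]:
rng = np.random.default_rng(0)
x = rng.beta(2.0, 5.0, size=50)
print(select_bin_count(x, D_max=25, c=0.0))  # plain AIC
print(select_bin_count(x, D_max=25, c=1.0))  # over-penalized variant
```

At small sample sizes, plain AIC tends to pick a larger D (more bins) than the over-penalized variant, which is exactly the overfitting behavior the paper's correction targets.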
Main file: SauNav_MLE_final.pdf (2.04 MB)
Supplementary material: SauNav_MLE_Supp.pdf (412.34 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-03286369, version 1 (14-07-2021)

Identifiers

Cite

Adrien Saumard, Fabien Navarro. Finite Sample Improvement of Akaike's Information Criterion. IEEE Transactions on Information Theory, 2021, 67 (10), ⟨10.1109/TIT.2021.3094770⟩. ⟨hal-03286369⟩
137 views
176 downloads
