On false discovery rate thresholding for classification under sparsity
Abstract
We study the properties of false discovery rate (FDR) thresholding, viewed as a classification procedure. The "$0$"-class (null) is assumed to have a known density, while the "$1$"-class (alternative) is obtained from the "$0$"-class either by translation or by scaling. Furthermore, the "$1$"-class is assumed to have a small number of elements relative to the "$0$"-class (sparsity). We focus on densities of the Subbotin family, including Gaussian and Laplace models. Non-asymptotic oracle inequalities are derived for the excess risk of FDR thresholding. These inequalities lead to explicit rates of convergence of the excess risk to zero, as the number $m$ of items to be classified tends to infinity and in a regime where the power of the Bayes rule is bounded away from $0$ and $1$. Moreover, these theoretical investigations suggest an explicit choice for the target level $\alpha_m$ of FDR thresholding, as a function of $m$. Our oracle inequalities show theoretically that the resulting FDR thresholding adapts to the unknown sparsity regime contained in the data. This property is illustrated with numerical experiments.
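To make the procedure concrete, the sketch below applies Benjamini-Hochberg-style FDR thresholding as a classifier in the translation model with a standard Gaussian null. The function name `bh_threshold_classifier`, the shift `mu`, the sparsity level, and the fixed level `alpha` are illustrative assumptions; in particular, the paper's recommended data-size-dependent level $\alpha_m$ is not reproduced here.

```python
# Minimal sketch (assumptions noted above): FDR thresholding used as a
# classifier for a sparse Gaussian translation mixture.
import numpy as np
from scipy.stats import norm

def bh_threshold_classifier(x, alpha):
    """Label items as "1" (alternative) when their one-sided p-value
    falls below the Benjamini-Hochberg data-driven threshold."""
    m = x.size
    pvals = norm.sf(x)                       # p-values under the N(0,1) null
    order = np.argsort(pvals)
    sorted_p = pvals[order]
    below = sorted_p <= alpha * np.arange(1, m + 1) / m
    if not below.any():
        return np.zeros(m, dtype=int)        # nothing classified as "1"
    k = np.max(np.nonzero(below)[0])         # largest index meeting the BH condition
    threshold = sorted_p[k]
    return (pvals <= threshold).astype(int)

# Illustrative simulation: a small fraction of items shifted by mu.
rng = np.random.default_rng(0)
m, mu = 10_000, 3.0
eps = m ** -0.5                              # assumed sparsity level, for illustration only
labels = (rng.random(m) < eps).astype(int)
x = rng.standard_normal(m) + mu * labels
pred = bh_threshold_classifier(x, alpha=0.1)
print("misclassification rate:", np.mean(pred != labels))
```

The data-driven threshold is what allows the procedure to adapt to the (unknown) proportion of "1"-class items: with fewer alternatives, the BH condition is met at a smaller index and the rejection threshold tightens accordingly.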