Journal article in Probability Theory and Related Fields, Year: 2018

On the convergence of the extremal eigenvalues of empirical covariance matrices with dependence

Abstract

Consider a sample of a centered random vector with unit covariance matrix. We show that under certain regularity assumptions, and up to a natural scaling, the smallest and the largest eigenvalues of the empirical covariance matrix converge, when the dimension and the sample size both tend to infinity, to the left and right edges of the Marchenko–Pastur distribution. The assumptions concern the tails of norms of orthogonal projections. They cover isotropic log-concave random vectors as well as random vectors with i.i.d. coordinates under almost optimal moment conditions. The method is a refinement of the rank-one update approach used by Srivastava and Vershynin to produce non-asymptotic quantitative estimates. In other words, we provide a new proof of the Bai and Yin theorem using basic tools from probability theory and linear algebra, together with an extension of this theorem to random matrices with dependent entries.
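For reference, here is a minimal statement of the limits in question in the classical i.i.d. setting of the Bai and Yin theorem; the normalization and the aspect ratio y below are the standard ones and are not quoted verbatim from the paper.

```latex
% Bai--Yin limits for the empirical covariance matrix
% \hat{\Sigma}_n = \frac{1}{n}\sum_{i=1}^{n} X_i X_i^\top,
% where X_1,\dots,X_n are centered with unit covariance in dimension p,
% and p/n \to y \in (0,1] as n,p \to \infty.
\[
  \lambda_{\min}\bigl(\hat{\Sigma}_n\bigr)
    \xrightarrow[n\to\infty]{\text{a.s.}} (1-\sqrt{y})^{2},
  \qquad
  \lambda_{\max}\bigl(\hat{\Sigma}_n\bigr)
    \xrightarrow[n\to\infty]{\text{a.s.}} (1+\sqrt{y})^{2}.
\]
% The two limits are precisely the left and right edges of the
% Marchenko--Pastur distribution with ratio y.
```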

Dates and versions

hal-01196046, version 1 (09-09-2015)

Identifiers

Cite

Konstantin Tikhomirov, Djalil Chafai. On the convergence of the extremal eigenvalues of empirical covariance matrices with dependence. Probability Theory and Related Fields, 2018, 170 (3), pp.847-889. ⟨10.1007/s00440-017-0778-9⟩. ⟨hal-01196046⟩