Large deviations for the largest eigenvalue of the sum of two random matrices
Abstract
In this paper, we consider the addition of two matrices in generic position, namely A + UBU*, where U is drawn under the Haar measure on the unitary or the orthogonal group. We show that, under mild conditions on the empirical spectral measures of the deterministic matrices A and B, the law of the largest eigenvalue satisfies a large deviation principle, in the scale N, with an explicit rate function involving the limit of spherical integrals. In particular, we cover all the cases where A and B have no outliers.
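For concreteness, here is a minimal numerical sketch of the model described above: it samples U from the Haar measure on the unitary group (via QR of a complex Ginibre matrix with the standard phase correction) and computes the largest eigenvalue of A + UBU*. The matrices A and B below are illustrative placeholders with compactly supported spectra and no outliers; they are assumptions for this demo, not choices made in the paper.

```python
import numpy as np

def haar_unitary(n, rng):
    """Sample an n x n Haar-distributed unitary matrix (QR with phase correction)."""
    z = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    d = np.diagonal(r)
    # Rescale the columns so the resulting distribution is exactly Haar.
    return q * (d / np.abs(d))

rng = np.random.default_rng(0)
n = 500
# Illustrative deterministic Hermitian matrices (placeholders, not from the paper):
# diagonal with uniformly spread spectra, hence no outlying eigenvalues.
A = np.diag(np.linspace(-1.0, 1.0, n))
B = np.diag(np.linspace(0.0, 2.0, n))

U = haar_unitary(n, rng)
M = A + U @ B @ U.conj().T          # the random sum A + UBU*
lam_max = np.linalg.eigvalsh(M)[-1]  # largest eigenvalue (eigvalsh sorts ascending)
print(f"largest eigenvalue of A + UBU*: {lam_max:.4f}")
```

In this setting the largest eigenvalue concentrates as N grows, and a large deviation principle in the scale N quantifies the exponentially small probability, of order exp(-N I(x)), of observing it near an atypical value x, with I the rate function.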
Domains
Probability [math.PR]