Dynamics of eigenvectors of random matrices and eigenvalues of nonlinear random matrix models

Authors
Publication date
2019
Publication type
Thesis
Summary
This thesis consists of two independent parts. The first part concerns the eigenvectors of random Wigner matrices. First, we study the distribution of the eigenvectors of deformed Wigner matrices, which consist of a Wigner matrix perturbed by a deterministic diagonal matrix. When the two matrices are of the same order of magnitude, it has been proved that the eigenvectors delocalize completely and the eigenvalues fall into the Wigner-Dyson-Mehta universality class. We study here an intermediate phase in which the deterministic perturbation dominates the randomness: the eigenvectors are not completely delocalized, while the eigenvalues remain universal. The eigenvector entries are asymptotically Gaussian, with a variance that localizes them in an explicit part of the spectrum. Moreover, their mass concentrates around this variance in the sense of quantum unique ergodicity. We then study correlations between different eigenvectors. To do so, we introduce a new observable on the eigenvector moments of the Dyson Brownian motion. It satisfies a closed parabolic equation which is a fermionic counterpart of the Bourgade-Yau eigenvector moment flow. By combining the analysis of these two observables, it is possible to study some of these correlations.

The second part concerns the distribution of eigenvalues of nonlinear random matrix models. These models appear in the study of random neural networks and correspond to a nonlinear version of sample covariance matrices, in the sense that a nonlinear function, called the activation function, is applied entrywise to the matrix. The eigenvalue distribution converges to a deterministic distribution characterized by a self-consistent degree-4 equation on its Stieltjes transform. The limiting distribution depends on the activation function through only two explicit parameters, and for some choices of these parameters we recover the Marchenko-Pastur distribution, which remains stable through the successive layers of the neural network.
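The deformed Wigner setting of the first part can be illustrated numerically. The sketch below, under stated assumptions (all names, dimensions, and scalings are illustrative, not taken from the thesis), builds a deterministic diagonal matrix plus a small Wigner matrix, runs one step of the additive matrix Brownian motion whose eigenvalues evolve by Dyson Brownian motion, and measures eigenvector delocalization via the largest squared entry (which is about 1/N for a fully delocalized eigenvector).

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200  # illustrative matrix size

def goe(n):
    """Sample a GOE-normalized symmetric Wigner matrix."""
    a = rng.standard_normal((n, n))
    return (a + a.T) / np.sqrt(2 * n)

# Deformed Wigner matrix: deterministic diagonal D dominating a small
# Wigner component (0.1 is an illustrative coupling, not from the thesis).
D = np.diag(np.linspace(-1.0, 1.0, N))
H0 = D + 0.1 * goe(N)

# One step of the matrix Brownian motion H_t = H_0 + sqrt(t) * GOE;
# its eigenvalues evolve according to Dyson Brownian motion.
t = 0.05
Ht = H0 + np.sqrt(t) * goe(N)

evals, evecs = np.linalg.eigh(Ht)  # columns of evecs are eigenvectors

# Delocalization proxy: max_i,alpha |u_alpha(i)|^2.
# Fully delocalized eigenvectors give a value of order 1/N.
print((evecs**2).max(), 1.0 / N)
```

Partial delocalization, as in the intermediate phase described above, would show this maximum sitting between 1/N and order one.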
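The nonlinear covariance model of the second part can also be sketched. The snippet below is a minimal illustration, with all names, dimensions, and the choice of tanh as activation being assumptions for the example: a nonlinear function is applied entrywise to a product of random matrices, and the spectrum of the resulting covariance-type matrix is computed.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 400, 800  # illustrative layer width and data dimension

W = rng.standard_normal((n, d)) / np.sqrt(d)  # random weight matrix
X = rng.standard_normal((d, d))               # random input data

f = np.tanh          # activation function, applied entrywise
Y = f(W @ X)         # one "layer" of the random neural network

# Nonlinear analogue of a sample covariance matrix.
M = (Y @ Y.T) / d

# Real spectrum of the symmetric matrix M; its empirical distribution
# approximates the limiting law described in the summary.
eigenvalues = np.linalg.eigvalsh(M)
print(eigenvalues.min(), eigenvalues.max())
```

Histogramming `eigenvalues` and comparing against the Marchenko-Pastur density for the same aspect ratio would exhibit the special parameter choices mentioned above.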