Limiting behavior of eigenvectors of large dimensional random matrices

XIA NINGNING
(B.Sc., Bohai University of China)

A THESIS SUBMITTED FOR THE DEGREE OF DOCTOR OF PHILOSOPHY
DEPARTMENT OF STATISTICS AND APPLIED PROBABILITY
NATIONAL UNIVERSITY OF SINGAPORE
2013

ACKNOWLEDGEMENTS

I would like to express my deep and sincere gratitude to my supervisor, Professor Bai Zhidong. His valuable guidance and continuous support were crucial to the completion of this thesis. He is truly a great mentor, not only in statistics but also in daily life. I have learned many things from him, especially regarding academic research and character building. Next, I would like to thank Assistant Professors Pan Guangming and Qin Yingli for discussions on various research topics. I also thank all my friends who helped make my life as a graduate student easier. Finally, I wish to express my gratitude to the university and the department for supporting me through the NUS Graduate Research Scholarship.

CONTENTS

Acknowledgements
Summary
List of Notations

Chapter 1  Introduction
  1.1 Large Dimensional Random Matrices
    1.1.1 Spectral Analysis
    1.1.2 Eigenvector
  1.2 Methodologies
    1.2.1 Moment Method
    1.2.2 Stieltjes Transform
    1.2.3 Organization of the Thesis

Chapter 2  Literature Review for Sample Covariance Matrices
  2.1 Spectral Analysis
    2.1.1 Limiting Spectral Distribution
    2.1.2 Limits of Extreme Eigenvalues
    2.1.3 Convergence Rate
    2.1.4 CLT of Linear Spectral Statistics
  2.2 Eigenvector Properties

Chapter 3  Convergence Rate of VESD for Sample Covariance Matrices
  3.1 Introduction and Main Result
    3.1.1 Introduction
    3.1.2 Main Theorems
  3.2 Methodology
    3.2.1 Stieltjes Transform
    3.2.2 Inequalities for the Distance between Distributions via Stieltjes Transforms
  3.3 Preliminary Formulae
  3.4 Proofs
    3.4.1 Truncation and Normalization
    3.4.2 Proof of Theorem 3.1
    3.4.3 Proof of Theorem 3.2
    3.4.4 Proof of Theorem 3.3
    3.4.5 Appendix

Chapter 4  Functional CLT of Eigenvectors for Sample Covariance Matrices
  4.1 Introduction and Main Result
    4.1.1 Introduction
    4.1.2 Main Result
  4.2 Bernstein Polynomial Strategy
  4.3 Proof
    4.3.1 Convergence of ∇1 − E∇1
    4.3.2 Mean Function
    4.3.3 Appendix

Chapter 5  Conclusion and Future Research
  5.1 Conclusion
  5.2 Future Research

Bibliography

SUMMARY

All classical limiting theorems in multivariate statistical analysis assume that the number of variables is fixed and that the sample size is much larger than the dimension of the data. In light of the rapid development of computer science, however, we now deal with large dimensional data in most cases. In the move from low dimensional to large dimensional problems, random matrix theory (RMT) has received much attention as an efficient approach and has developed significantly. The original statistical issues in multivariate analysis have turned into the investigation of the limiting properties of eigenvalues and eigenvectors of large dimensional random matrices in RMT.

Based on the observation that the limiting spectral properties of a large dimensional sample covariance matrix are asymptotically distribution free, and the fact that the matrix of eigenvectors (eigenmatrix) of a Wishart matrix is Haar distributed over the group of unitary matrices, it is conjectured that the eigenmatrix of a large sample covariance matrix should asymptotically behave as if it were Haar distributed under some moment conditions. This thesis is therefore concerned with the limiting behavior of eigenvectors of large sample covariance matrices.

The main work in this thesis consists of two parts. In the first part (Chapter 3), to investigate the limiting behavior of eigenvectors of a large sample covariance matrix, we define the eigenvector empirical spectral distribution (VESD), whose weights are defined by the eigenvectors, and establish three types of convergence rates of the VESD when the data dimension n and the sample size N tend to infinity proportionally. In the second part (Chapter 4), the limiting behavior of eigenvectors of sample covariance matrices is discussed further. Using Bernstein polynomial approximation and the results obtained in Chapter 3, we prove a central limit theorem for the linear spectral statistics associated with the VESD, indexed by a set of functions with continuous second-order derivatives. This result provides strong evidence for the conjecture that the eigenmatrix of a large sample covariance matrix is asymptotically Haar distributed. Based on the result in Chapter 4, we thus gain a better view of the asymptotic properties of eigenvectors of more general large random matrices, such as Wigner matrices.
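The VESD is easy to simulate. Below is a minimal numerical sketch, assuming the standard definition H^{S_n}(x) = Σ_{i=1}^n |u_i^* x_n|^2 I(λ_i ≤ x), where (λ_i, u_i) are the eigenvalue-eigenvector pairs of S_n = (1/N) X_n X_n^* and x_n is a fixed unit vector; the dimensions, the choice of x_n and the Gaussian entries are arbitrary illustrative choices. It compares the ESD and the VESD of one simulated S_n with the Marchenko-Pastur (M-P) distribution function.

```python
import numpy as np

def mp_cdf(x, c, grid_size=4000):
    """CDF of the Marchenko-Pastur law with ratio c = n/N <= 1 (unit variance)."""
    a, b = (1 - np.sqrt(c)) ** 2, (1 + np.sqrt(c)) ** 2
    t = np.linspace(a, b, grid_size)
    dens = np.sqrt(np.maximum((b - t) * (t - a), 0.0)) / (2 * np.pi * c * t)
    cdf = np.concatenate([[0.0], np.cumsum((dens[1:] + dens[:-1]) / 2 * np.diff(t))])
    cdf /= cdf[-1]                                    # remove small discretization error
    return np.interp(x, t, cdf, left=0.0, right=1.0)

rng = np.random.default_rng(0)
n, N = 400, 800                                       # dimension and sample size, c_n = 1/2
c = n / N
X = rng.standard_normal((n, N))                       # i.i.d. entries, mean 0, variance 1
S = X @ X.T / N                                       # sample covariance matrix S_n
lam, U = np.linalg.eigh(S)                            # eigenvalues (ascending) and eigenvectors

x_n = np.zeros(n); x_n[0] = 1.0                       # a fixed unit vector (here: e_1)
w = np.abs(U.T @ x_n) ** 2                            # VESD weights |u_i^* x_n|^2, summing to 1

grid = np.linspace(0.0, (1 + np.sqrt(c)) ** 2 + 0.5, 600)
esd  = np.array([np.mean(lam <= x) for x in grid])    # ESD  F^{S_n}(x)
vesd = np.array([np.sum(w[lam <= x]) for x in grid])  # VESD H^{S_n}(x)
mp   = mp_cdf(grid, c)

print("sup|ESD  - MP| ≈", np.max(np.abs(esd - mp)))
print("sup|VESD - MP| ≈", np.max(np.abs(vesd - mp)))
```

Both suprema shrink as n and N grow proportionally; the convergence rates summarized above quantify how fast.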
LIST OF NOTATIONS

X_n : (X_{ij})_{n×N} = (X_1, ..., X_N)
S_n : (1/N) X_n X_n^*, the simplified sample covariance matrix
\underline{S}_n : (1/N) X_n^* X_n, the companion matrix of S_n
x_n : a unit vector in C^n, ‖x_n‖ = 1
c_n : n/N → c, the dimension-to-sample-size ratio index
F_c(x), F_{c_n}(x) : the M-P law with ratio c and c_n, respectively
F^{S_n}(x), F^{\underline{S}_n}(x) : the empirical spectral distributions (ESD) of S_n and \underline{S}_n
\underline{F}_{c_n}(x) : (1 − c_n) I_{(0,∞)}(x) + c_n F_{c_n}(x)
H^{S_n}(x) : the eigenvector empirical spectral distribution (VESD) of S_n
m_n(z), m_n^H(z) : the Stieltjes transforms of F^{S_n}(x) and H^{S_n}(x)
m_n^0(z), m(z) : the Stieltjes transforms of F_{c_n}(x) and F_c(x)
\underline{m}_n(z) : the Stieltjes transform of F^{\underline{S}_n}(x)
\underline{m}_n^0(z), \underline{m}(z) : the Stieltjes transforms of \underline{F}_{c_n}(x) and \underline{F}_c(x)
U : an open interval including the support of the M-P law
G_n(x) : √N (H^{S_n}(x) − F_{c_n}(x))
G_n(f) : ∫_U f(x) dG_n(x), f ∈ C^2(U)
γ_m : the contour formed by the boundary of the rectangle with vertices a_l ± i/√m and b_r ± i/√m
f_m(x) : the Bernstein polynomial approximations of f
A_n = O_p(B_n) : lim_{t→∞} sup_n P(|A_n/B_n| ≥ t) = 0
‖x_n‖ : the Euclidean norm of a vector x_n ∈ C^n
‖A‖ : the spectral norm of a matrix, i.e. ‖A‖ = √(λ_max(AA^*))
‖F(x)‖ : the supremum norm of a function, i.e. ‖F(x)‖ = sup_x |F(x)|

Thus, for v = N^{−11/80}, we obtain
  √N |E m_n^H(z) − m_n^0(z)| ≤ K/(√N v^{3/2}) = o(1).
Since f_m(z) and the length of γ_m are bounded over γ_m, we conclude that the mean function is 0.

4.3.3 Appendix

Lemma 4.1 (Burkholder; Lemma 2.2 of Bai and Silverstein (1998)). Let X_k, k = 1, 2, ..., be a complex martingale difference sequence with respect to the increasing σ-fields F_k. Then, for p > 1,
  E|Σ_k X_k|^p ≤ K_p E(Σ_k |X_k|^2)^{p/2},
where K_p is a constant depending on p only.

Lemma 4.2 (Lemma 2.6 of Bai and Silverstein (1995)). Let z ∈ C^+ with v = ℑz, let A and B be n × n Hermitian matrices, and let r ∈ C^n. Then
  |tr[((B − zI)^{−1} − (B + rr^* − zI)^{−1}) A]| = |r^*(B − zI)^{−1} A (B − zI)^{−1} r| / |1 + r^*(B − zI)^{−1} r| ≤ ‖A‖/v.

Lemma 4.3 ((1.15) in Bai and Silverstein (2004)). Let X = (X_1, ..., X_n)', where the X_i are i.i.d. complex random variables with mean zero and variance 1, and let A = (a_{ij})_{n×n} and B = (b_{ij})_{n×n} be complex matrices. Then the following identity holds:
  E(X^* A X − tr A)(X^* B X − tr B) = (E|X_1|^4 − |E X_1^2|^2 − 2) Σ_{i=1}^n a_{ii} b_{ii} + |E X_1^2|^2 tr(AB^T) + tr(AB).

Lemma 4.4 (Lemma 2.7 of Bai and Silverstein (1998)). For X = (X_1, ..., X_n)' with i.i.d. standardized real or complex entries such that EX_i = 0 and E|X_i|^2 = 1, and for an n × n complex matrix C, we have, for any p ≥ 2,
  E|X^* C X − tr C|^p ≤ K_p [ (E|X_1|^4 tr(CC^*))^{p/2} + E|X_1|^{2p} tr(CC^*)^{p/2} ],
where K_p is a constant depending on p only.

Lemma 4.5 (Theorem 5.9 of Bai and Silverstein (2010)). Suppose that the entries of the matrix X_n = (X_{jk}, j ≤ n, k ≤ N) are independent (not necessarily identically distributed) and satisfy
(1) EX_{jk} = 0,
(2) |X_{jk}| ≤ √N δ_N,
(3) max_{j,k} |E|X_{jk}|^2 − σ^2| → 0 as N → ∞, and
(4) E|X_{jk}|^l ≤ b(√N δ_N)^{l−3} for all l ≥ 3,
where δ_N → 0 and b > 0. Let S_n = X_n X_n^*/N. Then, for any x > ε > 0 and integers j, k ≥ 2, we have
  P(λ_max(S_n) ≥ σ^2(1 + √c)^2 + x) ≤ K N^{−k} (σ^2(1 + √c)^2 + x − ε)^{−k}
for some constant K > 0.

Lemma 4.6 (Proposition 4.1 in Bai, Miao and Yao (2003)). If |z| < A, v ≥ O(N^{−1/2}) and l ≥ 1, then
  E|m_n(z) − E m_n(z)|^{2l} ≤ (K/(N^{2l} v^{4l} v_c^{2l})) (Δ + v/v_c)^l,
where A is a positive constant, v_c = |1 − √c_n| + √v, and Δ ≡ ‖E F^{S_n} − F_{c_n}‖.

Lemma 4.7 (Lemma 9.1 in Bai and Silverstein (2010)). Suppose that X_i, i = 1, ..., n, are independent, with EX_i = 0, E|X_i|^2 = 1, sup_i E|X_i|^4 = ν < ∞ and |X_i| ≤ η√n with η > 0. Assume that A is an n × n complex matrix. Then for any given p such that 2 ≤ p ≤ b log(n ν^{−1} η^4) and b > 1, we have
  E|α^* A α − tr A|^p ≤ ν n^p (n η^4)^{−1} (40 b^2 ‖A‖ η^2)^p,
where α = (X_1, ..., X_n)^T.
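Lemmas 4.1-4.7 above are standard tools quoted from the literature. A minimal numerical sketch of the rank-one resolvent identity and bound in Lemma 4.2 is given below, assuming only the statement of the lemma; the matrix size, the point z and the random Hermitian matrices are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50
z = 0.3 + 0.7j                                    # any z in C^+, with v = Im z > 0
v = z.imag

def random_hermitian(n):
    M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return (M + M.conj().T) / 2

A = random_hermitian(n)
B = random_hermitian(n)
r = rng.standard_normal(n) + 1j * rng.standard_normal(n)
I = np.eye(n)

R = np.linalg.inv(B - z * I)                      # resolvent (B - zI)^{-1}
R_pert = np.linalg.inv(B + np.outer(r, r.conj()) - z * I)

lhs = np.trace((R - R_pert) @ A)                  # tr[((B - zI)^{-1} - (B + rr* - zI)^{-1}) A]
rhs = (r.conj() @ R @ A @ R @ r) / (1 + r.conj() @ R @ r)

print("identity holds:", np.isclose(lhs, rhs))    # the two expressions coincide
print("|lhs| =", abs(lhs), "   bound ||A||/v =", np.linalg.norm(A, 2) / v)
```

The printed bound ‖A‖/v is exactly the right-hand side of Lemma 4.2.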
Lemma 4.8. If |b_1(z)| ≤ K, then for any fixed t > 0, P(|β_1(z)| > 2K) = o(N^{−t}).

Proof. Note that if |b_1(z) ξ̃_1(z)| ≤ 1/2, then
  |β_1(z)| = |b_1(z)| / |1 + b_1(z) ξ̃_1(z)| ≤ |b_1(z)| / (1 − |b_1(z) ξ̃_1(z)|) ≤ 2K.
Thus,
  P(|β_1(z)| > 2K) ≤ P(|b_1(z) ξ̃_1(z)| > 1/2) ≤ P(|ξ̃_1(z)| > 1/(2K)) ≤ (2K)^p E|ξ̃_1(z)|^p.
By the C_r-inequality, Lemma 4.6 and Lemma 4.7, for some η = η_N N^{−1/4} and p ≥ log N, we have
  E|ξ̃_1(z)|^p = E|ξ_1(z) + (1/N) tr A_1^{−1}(z) − (1/N) E tr A^{−1}(z)|^p
              ≤ K (N η_N^4)^{−1} (η_N^2 N^{−1/2} v^{−1})^p + K N^{−p/2} v^{−3p/2} v_c^{−p}
              ≤ K η_N^{2p−4} ≤ K η_N^p.
For any fixed t > 0, when N is large enough so that log η_N^{−1} > t + 1, it follows that
  E|ξ̃_1(z)|^p ≤ K e^{−p log η_N^{−1}} ≤ K e^{−p(t+1)} ≤ K e^{−(t+1) log N} = K N^{−t−1} = o(N^{−t}).
Similarly, we can show that P(|β̃_1(z)| > 2K) = o(N^{−t}) and E|ξ_1(z)|^p = o(N^{−t}) for p ≥ log N.

Lemma 4.9. There exists a constant K such that E|x_n^* A^{−1}(z) A^{−1}(z̄) x_n| ≤ K/v.

Proof. In this chapter we assume that c = lim n/N stays away from 1. By (8.4.9) in Bai and Silverstein (2010), m(z) is bounded by a constant, and hence E|m_n^H(z)| ≤ K for some constant K. Therefore,
  E|x_n^* A^{−1}(z) A^{−1}(z̄) x_n| = E[v^{−1} ℑ(x_n^* A^{−1}(z) x_n)] ≤ K/v.

Lemma 4.10. |∇_2| = O_p(N^{−ε_0}) when m = [N^{1/4+ε_0}].

Proof. Using integration by parts, we get
  |∇_2| = |√N ∫ (f_m(x) − f(x)) d(H^{S_n}(x) − F_{c_n}(x))|
        = |√N ∫_{a_l}^{b_r} (f_m'(x) − f'(x)) (H^{S_n}(x) − F_{c_n}(x)) dx|.
By Theorem 3.2 in Chapter 3, ‖H^{S_n} − F_{c_n}‖ = O_p(N^{−1/4}), and thus
  |∇_2| ≤ K N^{1/4} ∫_{a_0}^{b_0} |f̃_m'(y) − f̃'(y)| dy,
where a_0 = L a_l + t, b_0 = L b_r + t, and
  L = (1 − 2ε)/(b_r − a_l),  t = ((a_l + b_r)ε − a_l)/(b_r − a_l).
Here, with C(m, k) denoting the binomial coefficient,
  f̃_m'(y) = Σ_{k=0}^m (k/y − (m − k)/(1 − y)) C(m, k) y^k (1 − y)^{m−k} f̃(k/m)
           = Σ_{k=0}^m ((k − my)/(y(1 − y))) C(m, k) y^k (1 − y)^{m−k} f̃(k/m).
Using a Taylor expansion,
  f̃(k/m) = f̃(y) + f̃'(y)(k/m − y) + (1/2) f̃''(ξ_{k,y})(k/m − y)^2,
where ξ_{k,y} is a number between k/m and y. Substituting this expansion into f̃_m'(y), we obtain
  f̃_m'(y) − f̃'(y) = (1/2) Σ_{k=0}^m C(m, k) y^k (1 − y)^{m−k} ((k − my)/(y(1 − y))) (k/m − y)^2 f̃''(ξ_{k,y}) = O(1/m).
Therefore, |∇_2| ≤ K N^{1/4}/m = O_p(N^{−ε_0}).
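A minimal numerical sketch of the Bernstein polynomial approximation f_m used in Lemma 4.10 (and in the Bernstein polynomial strategy of Section 4.2) is given below; the smooth test function and the values of m are arbitrary illustrative choices, and the derivative is evaluated through the standard identity B_m'(f; y) = m Σ_{k=0}^{m−1} (f((k+1)/m) − f(k/m)) C(m−1, k) y^k (1 − y)^{m−1−k}. The printed errors shrink as m grows, in line with the O(1/m) estimate used above.

```python
import numpy as np
from math import comb

def bernstein(f_vals, y):
    """Evaluate B_m(f; y) = sum_k f(k/m) C(m,k) y^k (1-y)^(m-k) on a grid y."""
    m = len(f_vals) - 1
    k = np.arange(m + 1)
    coef = np.array([comb(m, j) for j in k], dtype=float)
    basis = coef[None, :] * y[:, None] ** k[None, :] * (1 - y[:, None]) ** (m - k)[None, :]
    return basis @ f_vals

def bernstein_deriv(f_vals, y):
    """Derivative of the Bernstein polynomial via forward differences of the coefficients."""
    m = len(f_vals) - 1
    diffs = m * np.diff(f_vals)              # m * (f((k+1)/m) - f(k/m)), k = 0..m-1
    return bernstein(diffs, y)               # degree m-1 Bernstein sum of the differences

f, df = np.exp, np.exp                        # an arbitrary smooth test function on [0, 1]
y = np.linspace(0.05, 0.95, 400)              # interior grid, as in the proof (y in [eps, 1-eps])

for m in (16, 64, 256, 1024):
    nodes = np.arange(m + 1) / m
    fm, dfm = bernstein(f(nodes), y), bernstein_deriv(f(nodes), y)
    print(f"m={m:5d}   sup|f_m - f| = {np.max(np.abs(fm - f(y))):.2e}   "
          f"sup|f_m' - f'| = {np.max(np.abs(dfm - df(y))):.2e}")
```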
Lemma 4.11. Under the conditions of Theorem 4.1, we have
  max_j |E_{j−1} Σ_{i=1}^n E_j[A_j^{−1}(z_1) x_n x_n^* A_j^{−1}(z_1)]_{ii} E_j[A_j^{−1}(z_2) x_n x_n^* A_j^{−1}(z_2)]_{ii}| = o_p(1)
uniformly in γ_{mh} ∪ γ'_{mh}, where the maximum is taken over 1 ≤ j ≤ N.

Proof. First, let e_i (1 ≤ i ≤ n) be the n-vector whose i-th element is 1 and whose other elements are 0, and let e_i' denote the transpose of e_i. Then, by Lemma 4.4 and the Cauchy-Schwarz inequality, we obtain
  |E([A_j^{−1}(z) x_n x_n^* A_j^{−1}(z)]_{ii} − [A^{−1}(z) x_n x_n^* A_j^{−1}(z)]_{ii})|
    = |E e_i'(A_j^{−1}(z) − A^{−1}(z)) x_n x_n^* A_j^{−1}(z) e_i|            (by (4.3))
    = |E β_j(z) e_i' A_j^{−1}(z) s_j s_j^* A_j^{−1}(z) x_n x_n^* A_j^{−1}(z) e_i|
    ≤ (K/N) E^{1/2}|X_j^* A_j^{−1}(z) x_n x_n^* A_j^{−1}(z) e_i e_i' A_j^{−1}(z) X_j − x_n^* A_j^{−1}(z) e_i e_i' A_j^{−2}(z) x_n|^2
        + (K/N) E|x_n^* A_j^{−1}(z) e_i e_i' A_j^{−2}(z) x_n|
    ≤ K/(N v^2).
Similarly, we can show that
  |E([A^{−1}(z) x_n x_n^* A_j^{−1}(z)]_{ii} − [A^{−1}(z) x_n x_n^* A^{−1}(z)]_{ii})| ≤ K/(N v^2).

Second, by the martingale inequality, for any ε > 0 we have
  P(max_{i,j} |E_j[A^{−1}(z) x_n x_n^* A^{−1}(z)]_{ii} − E[A^{−1}(z) x_n x_n^* A^{−1}(z)]_{ii}| > ε)
    ≤ n P(max_j |E_j[A^{−1}(z) x_n x_n^* A^{−1}(z)]_{ii} − E[A^{−1}(z) x_n x_n^* A^{−1}(z)]_{ii}| > ε)
    ≤ (n/ε^6) E|Σ_{l=1}^N (E_l − E_{l−1})([A^{−1}(z) x_n x_n^* A^{−1}(z)]_{ii} − [A^{−1}(z) x_n x_n^* A_l^{−1}(z)]_{ii}
        + [A^{−1}(z) x_n x_n^* A_l^{−1}(z)]_{ii} − [A_l^{−1}(z) x_n x_n^* A_l^{−1}(z)]_{ii})|^6
    ≤ n K E(Σ_{l=1}^N |(E_l − E_{l−1})([A^{−1}(z) x_n x_n^* A^{−1}(z)]_{ii} − [A^{−1}(z) x_n x_n^* A_l^{−1}(z)]_{ii}
        + [A^{−1}(z) x_n x_n^* A_l^{−1}(z)]_{ii} − [A_l^{−1}(z) x_n x_n^* A_l^{−1}(z)]_{ii})|^2)^3
    ≤ n K (Σ_{l=1}^N K N^{−2} v^{−4})^3 ≤ K/(N^2 v^{12}),
where the third inequality follows from Lemma 4.1. Together with
  E[A^{−1}(z) x_n x_n^* A^{−1}(z)]_{ii} = (1/n) E x_n^* A^{−2}(z) x_n,
we finally obtain
  max_j |E_{j−1} Σ_{i=1}^n E_j[A_j^{−1}(z_1) x_n x_n^* A_j^{−1}(z_1)]_{ii} E_j[A_j^{−1}(z_2) x_n x_n^* A_j^{−1}(z_2)]_{ii}|
    = n |E[A^{−1}(z_1) x_n x_n^* A^{−1}(z_1)]_{ii}| × |E[A^{−1}(z_2) x_n x_n^* A^{−1}(z_2)]_{ii}| + o_p(1)
    = (1/n) |E x_n^* A^{−2}(z_1) x_n| × |E x_n^* A^{−2}(z_2) x_n| + o_p(1)
    = o_p(1).
The proof of Lemma 4.11 is complete.

CHAPTER 5
Conclusion and Future Research

5.1 Conclusion

This thesis investigates the limiting behavior of the eigenvectors of a large dimensional sample covariance matrix S_n through a new form of empirical spectral distribution, the VESD, defined by the eigenvalues and eigenvectors of S_n.

In Chapter 3, we establish various convergence rates of the VESD to the M-P law, under at most a finite 10th moment condition on the underlying distribution, when the data dimension n and the sample size N tend to infinity proportionally: the rate for the expected VESD is O(N^{−1/2}) when the index c_n = n/N is bounded away from 0 and 1. Moreover, it is also proved that the convergence rate in probability of the VESD is O(N^{−1/4}) and that the almost sure convergence rate is O(N^{−1/4+η}) for any fixed η > 0.

In Chapter 4, we establish the central limit theorem for linear spectral statistics associated with the VESD. Using Bernstein polynomial approximations, we prove the central limit theorem for linear spectral statistics of H^{S_n}, indexed by a set of functions with continuous second-order derivatives on an interval including the support of the Marcenko-Pastur law. This result provides further evidence to support the conjecture that the eigenmatrix of the sample covariance matrix is asymptotically Haar distributed.

5.2 Future Research

• The theorems derived in this thesis are established for the simple sample covariance matrix S_n = (1/N) X_n X_n^*. One could extend this work to the generalized sample covariance matrix (1/N) T_n^{1/2} X_n X_n^* T_n^{1/2}.

• Examining the proof of Theorem 4.1, the assumption that f has a continuous second derivative is related to the convergence rate of ‖H^{S_n} − F_{c_n}‖ established in Theorem 3.2 of Chapter 3. It is conjectured that the convergence rate of ‖H^{S_n} − F_{c_n}‖ can be improved to O(N^{−1/2}), in line with the result of Theorem 3.1. Using the same strategy as in Chapter 4, Theorem 4.1 would then still hold under a continuous first derivative condition. In that case, it would help us better understand the tightness of the continuous functionals {G_n(f)}. (A small Monte Carlo illustration of these convergence rates is sketched below.)
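A small Monte Carlo sketch of the convergence rates just discussed follows; it only approximates the Kolmogorov distance D_N = sup_x |H^{S_n}(x) − F_{c_n}(x)| on a finite grid, and the sample sizes, replications and the choice of x_n are arbitrary illustrative choices. Theorem 3.2 suggests decay at least of order N^{−1/4}, while the conjectured improvement would be of order N^{−1/2}.

```python
import numpy as np

def mp_cdf(x, c, grid=4000):
    # CDF of the Marchenko-Pastur law with ratio c <= 1 (unit variance), by numerical integration
    a, b = (1 - np.sqrt(c)) ** 2, (1 + np.sqrt(c)) ** 2
    t = np.linspace(a, b, grid)
    dens = np.sqrt(np.maximum((b - t) * (t - a), 0.0)) / (2 * np.pi * c * t)
    F = np.concatenate([[0.0], np.cumsum((dens[1:] + dens[:-1]) / 2 * np.diff(t))])
    return np.interp(x, t, F / F[-1], left=0.0, right=1.0)

def vesd_mp_distance(n, N, rng):
    X = rng.standard_normal((n, N))
    lam, U = np.linalg.eigh(X @ X.T / N)
    x_n = np.ones(n) / np.sqrt(n)                          # a fixed unit vector
    w = (U.T @ x_n) ** 2                                   # VESD weights |u_i' x_n|^2
    xs = np.linspace(0.0, (1 + np.sqrt(n / N)) ** 2 + 0.5, 800)
    H = np.array([np.sum(w[lam <= x]) for x in xs])
    return np.max(np.abs(H - mp_cdf(xs, n / N)))

rng = np.random.default_rng(2)
c, reps = 0.5, 5
for N in (200, 400, 800, 1600):
    n = int(c * N)
    D = np.mean([vesd_mp_distance(n, N, rng) for _ in range(reps)])
    print(f"N = {N:5d}   mean sup|VESD - MP| ≈ {D:.4f}   N^(-1/2) = {N ** -0.5:.4f}")
```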
Bibliography

[1] Anderson, T. W. (2003). An Introduction to Multivariate Statistical Analysis, 3rd ed. Wiley Series in Probability and Statistics. Wiley, New York.
[2] Bai, Z. D. (1993). Convergence rate of expected spectral distributions of large random matrices. Part I. Wigner matrices. Ann. Probab. 21, 625-648.
[3] Bai, Z. D. (1993). Convergence rate of expected spectral distributions of large random matrices. Part II. Sample covariance matrices. Ann. Probab. 21, 649-672.
[4] Bai, Z. D., Miao, B. Q. and Pan, G. M. (2007). On asymptotics of eigenvectors of large sample covariance matrix. Ann. Probab. 35, 1532-1572.
[5] Bai, Z. D., Miao, B. Q. and Yao, J. F. (2003). Convergence rates of spectral distributions of large sample covariance matrices. SIAM J. Matrix Anal. Appl. 25, 105-127.
[6] Bai, Z. D. and Pan, G. M. (2011). Eigenvectors of large Wigner matrices. J. Stat. Phys. 146, 519-549.
[7] Bai, Z. D. and Saranadasa, H. (1996). Effect of high dimension: by an example of a two sample problem. Statist. Sinica 6, 311-329.
[8] Bai, Z. D. and Silverstein, J. W. (1998). No eigenvalues outside the support of the limiting spectral distribution of large-dimensional sample covariance matrices. Ann. Probab. 26, 316-345.
[9] Bai, Z. D. and Silverstein, J. W. (2004). CLT for linear spectral statistics of large-dimensional sample covariance matrices. Ann. Probab. 32, 553-605.
[10] Bai, Z. D., Silverstein, J. W. and Yin, Y. Q. (1988). A note on the largest eigenvalue of a large dimensional sample covariance matrix. J. Multivariate Anal. 26, 166-168.
[11] Bai, Z. D. and Silverstein, J. W. (2010). Spectral Analysis of Large Dimensional Random Matrices, 2nd ed. Springer Series in Statistics. Springer, New York.
[12] Bai, Z. D., Wang, X. Y. and Zhou, W. (2010). Functional CLT for sample covariance matrices. Bernoulli 16, 1086-1113.
[13] Bai, Z. D. and Yin, Y. Q. (1993). Limit of the smallest eigenvalue of a large dimensional sample covariance matrix. Ann. Probab. 21, 1275-1294.
[14] Bai, Z. D. and Yin, Y. Q. (1988). Necessary and sufficient conditions for the almost sure convergence of the largest eigenvalue of Wigner matrices. Ann. Probab. 16, 1729-1741.
[15] Billingsley, P. (1995). Probability and Measure, 3rd ed. Wiley, New York.
[16] Bobkov, S. G., Götze, F. and Tikhomirov, A. N. (2010). On concentration of empirical measures and convergence to the semi-circle law. J. Theoret. Probab. 23, 792-823.
[17] Bordenave, C. and Guionnet, A. (2012). Localization and delocalization of eigenvectors for heavy-tailed random matrices. arXiv:1201.1862.
[18] Dempster, A. P. (1958). A high dimensional two sample significance test. Ann. Math. Statist. 29, 995-1010.
[19] Diaconis, P. and Evans, S. N. (2001). Linear functionals of eigenvalues of random matrices. Trans. Amer. Math. Soc. 353, 2615-2633.
[20] Eldar, Y. C. and Chan, A. M. (2003). On the asymptotic performance of the decorrelator. IEEE Trans. Inform. Theory 49, 2309-2313.
[21] Erdős, L., Schlein, B. and Yau, H.-T. (2009). Local semicircle law and complete delocalization for Wigner random matrices. Commun. Math. Phys. 287, 641-655.
[22] Erdős, L., Schlein, B. and Yau, H.-T. (2009). Semicircle law on short scales and delocalization of eigenvectors for Wigner random matrices. Ann. Probab. 37, 815-852.
[23] Erdős, L., Yau, H.-T. and Yin, J. (2011). Rigidity of eigenvalues of generalized Wigner matrices. arXiv:1007.4652v7.
[24] Loève, M. (1977). Probability Theory, 4th ed. Springer-Verlag, New York.
[25] Geman, S. (1980). A limit theorem for the norm of random matrices. Ann. Probab. 8, 252-261.
[26] Götze, F. and Tikhomirov, A. (2004). Rate of convergence in probability to the Marchenko-Pastur law. Bernoulli 10, 503-548.
[27] Grenander, U. and Silverstein, J. W. (1977). Spectral analysis of networks with random topologies. SIAM J. Appl. Math. 32, 499-519.
[28] Halmos, P. R. (1950). Measure Theory. D. Van Nostrand Company, Inc., New York.
[29] Jiang, T. F. (2006). How many entries of a typical orthogonal matrix can be approximated by independent normals? Ann. Probab. 34, 1497-1529.
[30] Jiang, T. F. (2005). Maxima of entries of Haar distributed matrices. Probab. Theory Relat. Fields 131, 121-144.
[31] Johansson, K. (2000). Sharp fluctuations and random matrices. Ann. Stat. 38, 3724-3750.
[32] Jonsson, D. (1982). Some limit theorems for the eigenvalues of a sample covariance matrix. J. Multivariate Anal. 12, 1-38.
[33] Knowles, A. and Yin, J. (2011). Eigenvector distribution of Wigner matrices. arXiv:1102.0057v4.
[34] Mehta, M. L. (2004). Random Matrices, 3rd ed. Academic Press.
[35] Marčenko, V. A. and Pastur, L. A. (1967). Distribution of eigenvalues for some sets of random matrices. Math. USSR-Sb. 1, 457-483.
[36] Pillai, N. S. and Yin, J. (2012). Universality of covariance matrices. arXiv:1110.2501v3.
[37] Schenker, J. (2009). Eigenvector localization for random band matrices with power law band width. Commun. Math. Phys. 290, 1065-1097.
[38] Silverstein, J. W. (1981). Describing the behavior of eigenvectors of random matrices using sequences of measures on orthogonal groups. SIAM J. Math. Anal. 12, 274-281.
[39] Silverstein, J. W. (1984). Some limit theorems on the eigenvectors of large dimensional sample covariance matrices. J. Multivariate Anal. 15, 295-324.
[40] Silverstein, J. W. (1989). On the eigenvectors of large dimensional sample covariance matrices. J. Multivariate Anal. 30, 1-16.
[41] Silverstein, J. W. (1990). Weak convergence of random functions defined by the eigenvectors of sample covariance matrices. Ann. Probab. 18, 1174-1194.
[42] Silverstein, J. W. (1995). Strong convergence of the empirical distribution of eigenvalues of large dimensional random matrices. J. Multivariate Anal. 55, 331-339.
[43] Silverstein, J. W. and Bai, Z. D. (1995). On the empirical distribution of eigenvalues of a class of large dimensional random matrices. J. Multivariate Anal. 54, 175-192.
[44] Sinai, Ya. and Soshnikov, A. (1998). Central limit theorem for traces of large random symmetric matrices with independent matrix elements. Bol. Soc. Brasil. Mat. (N.S.) 29, 1-24.
[45] Su, Z. (2006). Gaussian fluctuations in complex sample covariance matrices. Electron. J. Probab. 11, 1284-1320.
[46] Tao, T. and Vu, V. (2012). Random matrices: sharp concentration of eigenvalues. arXiv:1201.4789v2.
[47] Tao, T. and Vu, V. (2011). Random matrices: universal properties of eigenvectors. arXiv:1103.2801v2.
[48] Tracy, C. A. and Widom, H. (1994). Level-spacing distributions and the Airy kernel. Comm. Math. Phys. 159, 151-174.
[49] Wachter, K. W. (1978). The strong limits of random matrix spectra for sample matrices of independent elements. Ann. Probab. 6, 1-18.
[50] Wigner, E. P. (1958). On the distributions of the roots of certain symmetric matrices. Ann. Math. 67, 325-327.
[51] Yin, Y. Q., Bai, Z. D. and Krishnaiah, P. R. (1988). On the limit of the largest eigenvalue of the large dimensional sample covariance matrix. Probab. Theory Related Fields 78, 509-521.
... importance of spectral analysis, practical applications of RMT have also raised the need for a better understanding of the limiting behavior of eigenvectors of large dimensional random matrices. For example, in principal component analysis (PCA), the eigenvectors corresponding to a few of the largest eigenvalues of random matrices (that is, the directions of the principal components) are of special ...

... sequences of random matrices with dimension (number of rows) tending to infinity. One of the main problems in RMT is to investigate the convergence of the sequence of empirical spectral distributions {F^{A_n}} for a given sequence of random matrices {A_n}. The limit distribution F, which is usually nonrandom, is called the limiting spectral distribution (LSD) of the sequence ... (a small numerical illustration of an ESD converging to its LSD is given after these excerpts)

1.1 Large Dimensional Random Matrices. The development of random matrix theory (RMT) comes from the fact that classical multivariate analysis is no longer suitable for dealing with large dimensional problems. All classical multivariate analysis assumes that the dimension of the data is small and fixed and that the number of observations, or sample size, is large and tends to infinity. However, most cases ...

... (1988) weakened the conditions to the assumption of the existence of the fourth moment. And in the same year, they further showed that the fourth moment condition is a necessary and sufficient condition for the existence of the limit of the largest eigenvalue of a large dimensional sample covariance matrix. Identifying the limit of the smallest eigenvalue of a large dimensional sample covariance matrix proves ...

... [a, b] with a > 0 lies outside the support of F_{c,H} and F_{c_n,H_n} for all large n. Here F_{c_n,H_n} is the limiting nonrandom distribution function associated with the limiting ratio c_n and distribution function H_n. Then P(no eigenvalue of B_n appears in [a, b] for all large n) = 1. 2.1.3 Convergence Rate. The third study on the spectral analysis of large dimensional random matrices concerns the investigation of its ...

... existence of the limiting spectral distribution of a class of large dimensional random matrices. This method gives no information about the convergence rate. As an open problem, the convergence rate of the empirical spectral distribution puzzled statisticians for decades until 1993, when Bai, for the first time, established a Berry-Esseen type inequality for the difference of two empirical spectral distributions in terms of ...

... dependence of F on c and H, the continuous density of F on R+, and a method of determining its support. 2.1.2 Limits of Extreme Eigenvalues. The second study on the spectral analysis of sample covariance matrices is the investigation of the limits of extreme eigenvalues. The first work in this direction was done by Geman (1980). He proved that the largest eigenvalue of a large dimensional ...

... dimensional random matrices, referring to Mehta (1990). The interest in the spectral analysis of large dimensional random matrices in statistical inference is due to the fact that many important statistics in classical multivariate analysis can be expressed as functionals of the ESD of some random matrices. Thus, we can revise the conventional results using random matrix theory and make them effective in applications ...
