Abstract: Principal component analysis (PCA) is one of the most commonly used statistical procedures, with a wide array of applications. This talk considers both minimax and adaptive estimation of the principal subspace in the high-dimensional setting. Under mild technical conditions, we first establish the optimal minimax rate for estimating the principal subspace, which is sharp with respect to all parameters and thus provides a complete non-asymptotic characterization of the statistical difficulty of the problem. The rate-optimal estimator is constructed using aggregation, which, however, might not be computationally feasible.
We then introduce an adaptive procedure for estimating the principal subspace that is fully data-driven and computationally efficient. It is shown that the estimator attains the minimax rate simultaneously over a large collection of parameter spaces. A key idea in our construction is a novel reduction scheme which reduces the sparse PCA problem to a high-dimensional regression problem.
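To fix ideas, the following sketch illustrates the basic principal subspace estimation problem under a spiked covariance model, using plain PCA (the top eigenvectors of the sample covariance) together with the projection-distance loss that is standard in this literature. This is a generic illustration only; the model parameters are arbitrary, and it does not implement the aggregation or regression-based estimators described in the talk.

```python
import numpy as np

rng = np.random.default_rng(0)
p, n, k = 50, 300, 2  # dimension, sample size, subspace dimension (arbitrary)

# Spiked covariance model: the top-k eigenvectors span the true subspace.
U = np.linalg.qr(rng.standard_normal((p, k)))[0]     # true basis, p x k
Sigma = U @ np.diag([10.0, 6.0]) @ U.T + np.eye(p)   # spiked covariance
X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)

# Vanilla PCA estimate: leading k eigenvectors of the sample covariance.
S = X.T @ X / n
eigvals, eigvecs = np.linalg.eigh(S)                 # ascending eigenvalues
U_hat = eigvecs[:, -k:]                              # estimated basis, p x k

# Projection-distance loss: Frobenius norm between projection matrices,
# which is invariant to the choice of orthonormal basis for each subspace.
P, P_hat = U @ U.T, U_hat @ U_hat.T
loss = np.linalg.norm(P_hat - P, "fro")
print(f"subspace estimation loss: {loss:.3f}")
```

The projection-distance loss ranges from 0 (identical subspaces) to sqrt(2k) (orthogonal subspaces); with strong spikes and n well above p, plain PCA already achieves a small loss, while the sparse, high-dimensional regime (p much larger than n) is where the estimators discussed in the talk are needed.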
Bio: Yihong Wu received the B.E. degree from Tsinghua University, Beijing, China, in 2006, and the M.A. and Ph.D. degrees from Princeton University, Princeton, NJ, in 2008 and 2011, respectively, all in electrical engineering. Prior to joining the ECE department at the University of Illinois in 2013 as an assistant professor, he was a postdoctoral fellow with the Statistics Department at The Wharton School of the University of Pennsylvania. His research interests are in information theory, high-dimensional statistics, communications, stochastic control, and optimal transportation.
Dr. Wu was a recipient of the 2011 Marconi Society Paul Baran Young Scholar Award and the Best Student Paper Award at the 2011 IEEE International Symposium on Information Theory (ISIT). His final year of graduate studies was supported by a Princeton University Honorific Wallace Memorial Fellowship (2010-2011).