NOISY MATRIX DECOMPOSITION VIA CONVEX RELAXATION: OPTIMAL RATES IN HIGH DIMENSIONS
Publication Type:
Article
Authors:
Agarwal, Alekh; Negahban, Sahand; Wainwright, Martin J.
Affiliations:
University of California System; University of California Berkeley; Massachusetts Institute of Technology (MIT)
Journal:
ANNALS OF STATISTICS
ISSN:
0090-5364
DOI:
10.1214/12-AOS1000
Publication Date:
2012
Pages:
1171-1197
Keywords:
low-rank matrices
linear-regression
Abstract:
We analyze a class of estimators based on convex relaxation for solving high-dimensional matrix decomposition problems. The observations are noisy realizations of a linear transformation of the sum of an (approximately) low-rank matrix Θ* with a second matrix Γ* endowed with a complementary form of low-dimensional structure; this setup includes many statistical models of interest, including factor analysis, multi-task regression and robust covariance estimation. We derive a general theorem that bounds the Frobenius-norm error for an estimate of the pair (Θ*, Γ*) obtained by solving a convex optimization problem that combines the nuclear norm with a general decomposable regularizer. Our results use a "spikiness" condition that is related to, but milder than, singular-vector incoherence. We specialize our general result to two cases that have been studied in past work: low rank plus an entrywise-sparse matrix, and low rank plus a columnwise-sparse matrix. For both models, our theory yields nonasymptotic Frobenius error bounds for both deterministic and stochastic noise matrices, and applies to matrices Θ* that can be exactly or approximately low rank, and matrices Γ* that can be exactly or approximately sparse. Moreover, for the case of stochastic noise matrices and the identity observation operator, we establish matching lower bounds on the minimax error. The sharpness of our nonasymptotic predictions is confirmed by numerical simulations.
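For the entrywise-sparse case described in the abstract, the decomposable regularizer is the elementwise ℓ1 norm, so the convex program pairs a least-squares fit with a nuclear-norm penalty on Θ and an ℓ1 penalty on Γ. The sketch below is not the authors' implementation; it is a minimal proximal-gradient solver for the identity observation operator, assuming illustrative regularization weights `lam_theta` and `lam_gamma` and a step size of 1/2 (the Lipschitz constant of the joint smooth part is 2).

```python
import numpy as np

def svt(M, tau):
    # Singular value thresholding: proximal operator of tau * nuclear norm.
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def soft(M, tau):
    # Elementwise soft thresholding: proximal operator of tau * l1 norm.
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

def decompose(Y, lam_theta=1.0, lam_gamma=0.1, step=0.5, iters=300):
    """Approximately minimize
        0.5 * ||Y - Theta - Gamma||_F^2
        + lam_theta * ||Theta||_nuclear + lam_gamma * ||Gamma||_1
    by proximal gradient descent on the pair (Theta, Gamma)."""
    Theta = np.zeros_like(Y)
    Gamma = np.zeros_like(Y)
    for _ in range(iters):
        R = Y - Theta - Gamma          # gradient of the smooth part is -R for both blocks
        Theta = svt(Theta + step * R, step * lam_theta)
        Gamma = soft(Gamma + step * R, step * lam_gamma)
    return Theta, Gamma

# Illustrative use: a rank-2 matrix plus a few large sparse corruptions.
np.random.seed(0)
n = 30
L = np.random.randn(n, 2) @ np.random.randn(2, n)   # exactly low rank
S = np.zeros((n, n))
S[np.random.rand(n, n) < 0.05] = 5.0                # entrywise sparse
Theta_hat, Gamma_hat = decompose(L + S)
```

At a stationary point, the residual Y − Θ − Γ has operator norm at most `lam_theta` and entrywise magnitude at most `lam_gamma`, which is one way to read the roles of the two penalties; the Frobenius-error guarantees in the paper quantify how close (Θ̂, Γ̂) is to (Θ*, Γ*) under the spikiness condition.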