On the Complexity of Robust PCA and l1-Norm Low-Rank Matrix Approximation

Publication Type:
Article
Authors:
Gillis, Nicolas; Vavasis, Stephen A.
Affiliations:
University of Mons; University of Waterloo
Journal:
MATHEMATICS OF OPERATIONS RESEARCH
ISSN/ISBN:
0364-765X
DOI:
10.1287/moor.2017.0895
Publication Year:
2018
Pages:
1072-1084
Keywords:
principal component analysis; missing data; norm; factorization; sparsity
Abstract:
The low-rank matrix approximation problem with respect to the component-wise l1-norm (l1-LRA), which is closely related to robust principal component analysis (PCA), has become a very popular tool in data mining and machine learning. Robust PCA aims to recover a low-rank matrix that was perturbed with sparse noise, with applications such as foreground-background video separation. Although l1-LRA is strongly believed to be NP-hard, there is, to our knowledge, no formal proof of this fact. In this paper, we prove that l1-LRA is NP-hard, even in the rank-one case, using a reduction from MAX CUT. Our derivations draw interesting connections between l1-LRA and several other well-known problems, namely robust PCA, l0-LRA, binary matrix factorization, a particular densest bipartite subgraph problem, the computation of the cut norm of {-1, +1} matrices, and the discrete basis problem, all of which we prove to be NP-hard.
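The paper addresses the complexity of the problem rather than algorithms for it, but the rank-one objective it shows to be NP-hard, minimizing the sum of |M_ij - u_i v_j| over vectors u and v, can be illustrated with a standard alternating weighted-median heuristic. The sketch below is not the authors' method; the function names, initialization, and iteration count are illustrative assumptions, and the procedure only finds a locally optimal factorization, which is consistent with the NP-hardness of the global problem.

```python
import numpy as np


def weighted_median(values, weights):
    """Return a minimizer of sum_k weights[k] * |values[k] - x| (weights >= 0)."""
    order = np.argsort(values)
    v, w = values[order], weights[order]
    cum = np.cumsum(w)
    # First index where the cumulative weight reaches half of the total weight.
    idx = np.searchsorted(cum, 0.5 * cum[-1])
    return v[idx]


def rank_one_l1_lra(M, n_iter=50, seed=0):
    """Alternating heuristic for min_{u,v} ||M - u v^T||_1 (component-wise l1-norm).

    Illustrative sketch only: each subproblem in u (resp. v) separates into
    weighted-median problems, but the overall problem is NP-hard, so this
    converges to a local solution at best.
    """
    rng = np.random.default_rng(seed)
    m, n = M.shape
    v = rng.standard_normal(n)
    u = np.zeros(m)
    for _ in range(n_iter):
        # Update u: each u_i is a weighted median of M[i, j] / v[j] with weights |v[j]|.
        mask = np.abs(v) > 1e-12
        if not mask.any():
            break
        for i in range(m):
            u[i] = weighted_median(M[i, mask] / v[mask], np.abs(v[mask]))
        # Update v symmetrically using the current u.
        mask = np.abs(u) > 1e-12
        if not mask.any():
            break
        for j in range(n):
            v[j] = weighted_median(M[mask, j] / u[mask], np.abs(u[mask]))
    return u, v


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Rank-one matrix corrupted by sparse outliers, in the spirit of robust PCA.
    M = np.outer(rng.standard_normal(30), rng.standard_normal(20))
    M[rng.random(M.shape) < 0.1] += 5.0
    u, v = rank_one_l1_lra(M)
    print("l1 approximation error:", np.abs(M - np.outer(u, v)).sum())
```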