TENSOR-ON-TENSOR REGRESSION: RIEMANNIAN OPTIMIZATION, OVER-PARAMETERIZATION, STATISTICAL-COMPUTATIONAL GAP AND THEIR INTERPLAY
Publication type:
Article
Authors:
Luo, Yuetian; Zhang, Anru R.
Affiliations:
University of Chicago; Duke University; Duke University
Journal:
ANNALS OF STATISTICS
ISSN/ISBN:
0090-5364
DOI:
10.1214/24-AOS2396
Publication date:
2024
Pages:
2583-2612
Keywords:
low-rank
matrix completion
approximation
decompositions
algorithms
recovery
sparse
trust
Abstract:
We study tensor-on-tensor regression, where the goal is to connect tensor responses to tensor covariates through a low-Tucker-rank parameter tensor/matrix without prior knowledge of its intrinsic rank. We propose the Riemannian gradient descent (RGD) and Riemannian Gauss-Newton (RGN) methods and cope with the challenge of unknown rank by studying the effect of rank over-parameterization. We provide the first convergence guarantee for general tensor-on-tensor regression by showing that RGD and RGN converge linearly and quadratically, respectively, to a statistically optimal estimate in both the rank correctly-parameterized and over-parameterized settings. Our theory reveals an intriguing phenomenon: Riemannian optimization methods naturally adapt to over-parameterization without modifications to their implementation. We also prove the statistical-computational gap in scalar-on-tensor regression by a direct low-degree polynomial argument. Our theory demonstrates a blessing-of-statistical-computational-gap phenomenon: in a wide range of scenarios in tensor-on-tensor regression for tensors of order three or higher, the computationally required sample size matches what is needed under moderate rank over-parameterization when restricting to computationally feasible estimators, while no such benefit exists in the matrix setting. This shows that moderate rank over-parameterization is essentially cost-free in terms of sample size in tensor-on-tensor regression of order three or higher. Finally, we conduct simulation studies to show the advantages of our proposed methods and to corroborate our theoretical findings.
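To illustrate the flavor of the RGD method described in the abstract, the sketch below runs gradient descent with a low-rank retraction (truncated SVD) on a noiseless matrix-regression instance, the order-two special case of the model. This is a minimal illustrative sketch, not the paper's exact algorithm: the step size, problem sizes, and the SVD-based retraction are assumptions chosen for simplicity.

```python
import numpy as np

def retract_rank_r(X, r):
    # Retraction onto the rank-r manifold via truncated SVD.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r]

def rgd_matrix_regression(A, y, r, eta=1.0, iters=100):
    """Gradient step on the least-squares loss for y_i = <A_i, X*>,
    followed by a retraction back to the rank-r manifold."""
    n, p1, p2 = A.shape
    X = np.zeros((p1, p2))
    for _ in range(iters):
        resid = np.einsum('ijk,jk->i', A, X) - y   # residuals <A_i, X> - y_i
        grad = np.einsum('i,ijk->jk', resid, A) / n
        X = retract_rank_r(X - eta * grad, r)
    return X

# Synthetic noiseless instance with a rank-2 ground truth (hypothetical sizes).
rng = np.random.default_rng(0)
p, r, n = 10, 2, 300
Xstar = rng.standard_normal((p, r)) @ rng.standard_normal((r, p))
A = rng.standard_normal((n, p, p))
y = np.einsum('ijk,jk->i', A, Xstar)

Xhat = rgd_matrix_regression(A, y, r)
err = np.linalg.norm(Xhat - Xstar) / np.linalg.norm(Xstar)
```

With enough Gaussian measurements relative to the intrinsic dimension, the relative error `err` contracts toward zero; the paper's results concern the tensor analogue with Tucker-rank retractions and cover the over-parameterized case as well.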
Source URL: