SPARSITY IN MULTIPLE KERNEL LEARNING
Document type:
Article
Authors:
Koltchinskii, Vladimir; Yuan, Ming
Affiliations:
University System of Georgia; Georgia Institute of Technology
Journal:
ANNALS OF STATISTICS
ISSN/ISBN:
0090-5364
DOI:
10.1214/10-AOS825
Publication date:
2010
Pages:
3660-3695
Keywords:
Dantzig selector
Lasso
Abstract:
The problem of multiple kernel learning based on penalized empirical risk minimization is discussed. The complexity penalty is determined jointly by the empirical L_2 norms and the reproducing kernel Hilbert space (RKHS) norms induced by the kernels with a data-driven choice of regularization parameters. The main focus is on the case when the total number of kernels is large, but only a relatively small number of them is needed to represent the target function, so that the problem is sparse. The goal is to establish oracle inequalities for the excess risk of the resulting prediction rule showing that the method is adaptive both to the unknown design distribution and to the sparsity of the problem.
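A minimal sketch, in LaTeX, of the kind of penalized empirical risk minimization described in the abstract; the loss \(\ell\) and the particular weighting of the empirical L_2 and RKHS norms in the penalty are illustrative assumptions, not necessarily the paper's exact formulation:

\[
\hat f \;=\; \operatorname*{arg\,min}_{\substack{f = f_1 + \cdots + f_N \\ f_j \in H_{K_j}}}
\left\{ \frac{1}{n}\sum_{i=1}^{n} \ell\bigl(Y_i, f(X_i)\bigr)
\;+\; \sum_{j=1}^{N} \Bigl( \varepsilon \,\|f_j\|_{L_2(\Pi_n)} \;+\; \varepsilon^{2} \,\|f_j\|_{H_{K_j}} \Bigr) \right\},
\]

where \(H_{K_1},\dots,H_{K_N}\) are the RKHSs induced by the candidate kernels, \(\|f_j\|_{L_2(\Pi_n)}^{2} = n^{-1}\sum_{i=1}^{n} f_j^{2}(X_i)\) is the empirical L_2 norm, and \(\varepsilon\) stands for a regularization parameter chosen in a data-driven way. Sparsity of the problem means that only a few of the components \(f_j\) are needed to represent the target function.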