ASYMPTOTIC OPTIMALITY OF THE FAST RANDOMIZED VERSIONS OF GCV AND CL IN RIDGE-REGRESSION AND REGULARIZATION

Document Type:
Article
Author(s):
GIRARD, DA
Affiliation(s):
Communaute Universite Grenoble Alpes; Universite Grenoble Alpes (UGA)
Journal:
ANNALS OF STATISTICS
ISSN/ISBN:
0090-5364
DOI:
10.1214/aos/1176348380
Publication Year:
1991
Pages:
1950-1963
Keywords:
generalized cross-validation; least-squares problems; smoothing noisy data; spline functions; parameters
Abstract:
Ridge regression is a well-known technique to estimate the coefficients of a linear model. The method of regularization is a similar approach commonly used to solve underdetermined linear equations with discrete noisy data. When applying such a technique, the choice of the smoothing (or regularization) parameter h is crucial. Generalized cross-validation (GCV) and Mallows' C(L) are two popular methods for estimating a good value of h from the data. Their asymptotic properties, such as consistency and asymptotic optimality, have been studied extensively [Craven and Wahba (1979); Golub, Heath and Wahba (1979); Speckman (1985)]. Very interesting convergence results for the actual (random) parameter given by GCV and C(L) have been shown by Li (1985, 1986). Recently, Girard (1987, 1989) proposed fast randomized versions of GCV and C(L). The purpose of this paper is to show that the above convergence results also hold for these new methods.
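As an illustrative sketch (not the paper's exact algorithm or notation), the following NumPy code contrasts the exact GCV score for ridge regression with a randomized variant in which the trace of the influence ("hat") matrix A(h) is replaced by a stochastic estimate from a single random probe vector w, in the spirit of the fast randomized GCV discussed above. All data, names, and the grid of h values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical simulated linear model y = X b + noise (not data from the paper).
n, p = 200, 30
X = rng.standard_normal((n, p))
beta = rng.standard_normal(p)
y = X @ beta + 0.5 * rng.standard_normal(n)

def influence_apply(h, v):
    """Apply the ridge influence matrix A(h) = X (X'X + n h I)^{-1} X' to a vector v."""
    return X @ np.linalg.solve(X.T @ X + n * h * np.eye(p), X.T @ v)

def gcv(h):
    """Exact GCV score V(h) = (1/n)||(I - A(h)) y||^2 / [(1/n) tr(I - A(h))]^2."""
    A = X @ np.linalg.solve(X.T @ X + n * h * np.eye(p), X.T)
    resid = y - A @ y
    return (resid @ resid / n) / (1.0 - np.trace(A) / n) ** 2

def randomized_gcv(h, w):
    """Randomized GCV: replace tr(A(h))/n by the stochastic estimate
    (w' A(h) w) / (w' w), so A(h) is only applied to vectors, never formed."""
    resid = y - influence_apply(h, y)
    tr_over_n = (w @ influence_apply(h, w)) / (w @ w)
    return (resid @ resid / n) / (1.0 - tr_over_n) ** 2

w = rng.standard_normal(n)  # one random probe vector
hs = np.logspace(-4, 1, 40)
h_gcv = hs[np.argmin([gcv(h) for h in hs])]
h_rand = hs[np.argmin([randomized_gcv(h, w) for h in hs])]
print(h_gcv, h_rand)
```

The point of the randomized version is computational: the exact score needs tr(A(h)), which costs an explicit n-by-n influence matrix, whereas the randomized score only needs matrix-vector products with A(h), one for y and one for the probe w; the paper's result is that the h minimizing the randomized score retains the asymptotic optimality of the exact GCV choice.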