ADAPTATION IN LOG-CONCAVE DENSITY ESTIMATION
Publication type:
Article
Authors:
Kim, Arlene K. H.; Guntuboyina, Adityanand; Samworth, Richard J.
Affiliations:
University of Cambridge; Sungshin Women's University; University of California System; University of California Berkeley
Journal:
ANNALS OF STATISTICS
ISSN:
0090-5364
DOI:
10.1214/17-AOS1619
Publication year:
2018
Pages:
2279-2306
Keywords:
maximum likelihood estimation
least squares estimation
global rates
convergence
consistency
Abstract:
The log-concave maximum likelihood estimator of a density on the real line based on a sample of size n is known to attain the minimax optimal rate of convergence of O(n^{-4/5}) with respect to, for example, squared Hellinger distance. In this paper, we show that it also enjoys attractive adaptation properties, in the sense that it achieves a faster rate of convergence when the logarithm of the true density is k-affine (i.e., made up of k affine pieces), or close to k-affine, provided in each case that k is not too large. Our results use two different techniques: the first relies on a new Marshall's inequality for log-concave density estimation, and reveals that when the true density is close to log-linear on its support, the log-concave maximum likelihood estimator can achieve the parametric rate of convergence in total variation distance. Our second approach depends on local bracketing entropy methods, and allows us to prove a sharp oracle inequality, which implies in particular a risk bound of O((k/n) log^{5/4}(en/k)) with respect to various global loss functions, including Kullback-Leibler divergence, when the true density is log-concave and its logarithm is close to k-affine.
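The two rates quoted in the abstract can be written out in standard notation. The following LaTeX sketch restates them; the symbols (f_0 for the true density, \hat{f}_n for the log-concave MLE, d_H for Hellinger distance, and the knots t_j) are notational assumptions for illustration and are not taken from the record itself:

```latex
% log f_0 is k-affine: piecewise affine with k pieces on the support of f_0
% (knots t_0 < t_1 < \dots < t_k are assumed notation):
\log f_0(x) = a_j x + b_j, \qquad x \in [t_{j-1}, t_j], \quad j = 1, \dots, k.

% Minimax rate for general log-concave f_0 (squared Hellinger distance):
d_H^2\bigl(\hat{f}_n, f_0\bigr) = O\bigl(n^{-4/5}\bigr).

% Adaptive risk bound when log f_0 is (close to) k-affine, with k not too large,
% for various global losses including Kullback--Leibler divergence:
\frac{k}{n} \log^{5/4}\!\Bigl(\frac{en}{k}\Bigr).
```

Note that for fixed k the adaptive bound is nearly the parametric rate k/n, up to the logarithmic factor, which is faster than n^{-4/5}.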