ON SURROGATE LOSS FUNCTIONS AND f-DIVERGENCES
Publication type:
Article
Authors:
Nguyen, XuanLong; Wainwright, Martin J.; Jordan, Michael I.
Affiliations:
Duke University; University of California System; University of California Berkeley
Journal:
ANNALS OF STATISTICS
ISSN:
0090-5364
DOI:
10.1214/08-AOS595
Publication date:
2009
Pages:
876-904
Keywords:
decentralized detection
distance measures
consistency
classification
design
Abstract:
The goal of binary classification is to estimate a discriminant function gamma from observations of covariate vectors and corresponding binary labels. We consider an elaboration of this problem in which the covariates are not available directly but are transformed by a dimensionality-reducing quantizer Q. We present conditions on loss functions such that empirical risk minimization yields Bayes consistency when both the discriminant function and the quantizer are estimated. These conditions are stated in terms of a general correspondence between loss functions and a class of functionals known as Ali-Silvey or f-divergence functionals. Whereas this correspondence was established by Blackwell [Proc. 2nd Berkeley Symp. Probab. Statist. 1 (1951) 93-102. Univ. California Press, Berkeley] for the 0-1 loss, we extend the correspondence to the broader class of surrogate loss functions that play a key role in the general theory of Bayes consistency for binary classification. Our result makes it possible to pick out the (strict) subset of surrogate loss functions that yield Bayes consistency for joint estimation of the discriminant function and the quantizer.
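As a notational sketch of the correspondence described in the abstract (the symbols below are this record's own shorthand, not quoted from the paper: I_f denotes an f-divergence, mu and pi the measures induced on the quantized space by Q under the two class-conditional distributions, and R_phi(Q) the optimal surrogate risk at a fixed quantizer):

% Minimal sketch under the assumptions stated above; constants and
% precise conditions on f follow the paper's definitions.
\[
  I_f(\mu, \pi) \;=\; \int f\!\left(\frac{d\mu}{d\pi}\right) d\pi,
  \qquad f \ \text{convex},\ f(1) = 0,
\]
\[
  R_\varphi(Q) \;=\; \inf_{\gamma}\, \mathbb{E}\,\varphi\bigl(Y\,\gamma(Z)\bigr)
  \;=\; -\, I_{f_\varphi}(\mu, \pi)
  \quad \text{for an associated convex } f_\varphi,
\]
where Z = Q(X) is the quantized covariate and Y the binary label. In this form, comparing quantizers by their optimal surrogate risk amounts to comparing them by the induced f-divergence; Blackwell's result for the 0-1 loss, which corresponds to a total-variation-type divergence, is the special case the paper generalizes.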