Fast rates for support vector machines using Gaussian kernels

Document type:
Article
Authors:
Steinwart, Ingo; Scovel, Clint
Affiliations:
United States Department of Energy (DOE); Los Alamos National Laboratory
Journal:
ANNALS OF STATISTICS
ISSN/ISBN:
0090-5364
DOI:
10.1214/009053606000001226
Publication year:
2007
Pages:
575-607
Keywords:
empirical processes; concentration inequalities; classification; consistency; risk; discrimination; convergence; classifiers; spaces; bounds
Abstract:
For binary classification we establish learning rates up to the order of n^{-1} for support vector machines (SVMs) with hinge loss and Gaussian RBF kernels. These rates are stated in terms of two assumptions on the distributions considered: Tsybakov's noise assumption, which is used to establish a small estimation error, and a new geometric noise condition, which is used to bound the approximation error. Unlike previously proposed concepts for bounding the approximation error, the geometric noise assumption does not involve any smoothness assumption.
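The learning method analyzed in the abstract, an SVM minimizing the regularized hinge loss with a Gaussian RBF kernel, can be illustrated with a minimal sketch. This is not the paper's theoretical construction; it simply trains such an SVM (via scikit-learn's `SVC`, which solves the hinge-loss problem) on synthetic binary-classification data with a smooth, non-linear decision boundary; the sample size, boundary, and hyperparameters are arbitrary choices for illustration.

```python
# Illustrative sketch only: an SVM with hinge loss and a Gaussian (RBF)
# kernel, the classifier whose learning rates the paper analyzes.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 400
X = rng.uniform(-1.0, 1.0, size=(n, 2))
# Binary labels separated by a circular (smooth, non-linear) boundary.
y = (X[:, 0] ** 2 + X[:, 1] ** 2 > 0.5).astype(int)

# SVC with kernel="rbf" minimizes the regularized hinge loss using a
# Gaussian kernel; gamma plays the role of the inverse kernel width,
# one of the quantities balanced in the paper's rate analysis.
clf = SVC(kernel="rbf", C=1.0, gamma=1.0)
clf.fit(X, y)
train_acc = clf.score(X, y)
```

In the paper's setting, both the regularization parameter (here `C`) and the kernel width (here `1/gamma`) must be chosen as functions of the sample size n to achieve the stated rates.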