Naive Feature Selection: A Nearly Tight Convex Relaxation for Sparse Naive Bayes
Publication Type:
Article
Authors:
Askari, Armin; d'Aspremont, Alexandre; El Ghaoui, Laurent
Affiliations:
University of California System; University of California Berkeley; Centre National de la Recherche Scientifique (CNRS); Universite PSL; Ecole Normale Superieure (ENS)
Journal:
MATHEMATICS OF OPERATIONS RESEARCH
ISSN/ISBN:
0364-765X
DOI:
10.1287/moor.2023.1356
Publication Date:
2024
Keywords:
Abstract:
Because of its linear complexity, naive Bayes classification remains an attractive supervised learning method, especially in very large-scale settings. We propose a sparse version of naive Bayes, which can be used for feature selection. This leads to a combinatorial maximum-likelihood problem, for which we provide an exact solution in the case of binary data, or a bound in the multinomial case. We prove that our convex relaxation bounds become tight as the marginal contribution of additional features decreases, using a priori duality gap bounds derived from the Shapley-Folkman theorem. We show how to produce primal solutions satisfying these bounds. Both the binary and multinomial sparse models are solvable in time almost linear in problem size, representing a very small extra relative cost compared with classical naive Bayes. Numerical experiments on text data show that the naive Bayes feature selection method is as statistically effective as state-of-the-art feature selection methods such as recursive feature elimination, l1-penalized logistic regression, and LASSO, while being orders of magnitude faster (a Python implementation is available at https://github.com/aspremon/NaiveFeatureSelection).
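To illustrate the idea described in the abstract, the following is a minimal, hypothetical sketch of naive-Bayes-based feature selection on binary data: each feature is scored by the log-likelihood gain of fitting class-conditional Bernoulli models over a single pooled model, and the top-k features are kept. The function names and scoring rule are illustrative assumptions, not the authors' exact algorithm (their implementation is at https://github.com/aspremon/NaiveFeatureSelection).

```python
import numpy as np

def bernoulli_nb_scores(X, y, eps=1e-9):
    """Per-feature score: log-likelihood gain of class-conditional
    Bernoulli fits over a single pooled Bernoulli fit (illustrative)."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)

    def ll(p, x):
        # Bernoulli log-likelihood, summed over samples, per feature.
        p = np.clip(p, eps, 1 - eps)
        ones = x.sum(axis=0)
        return ones * np.log(p) + (len(x) - ones) * np.log(1 - p)

    pooled = ll(X.mean(axis=0), X)
    split = sum(ll(X[y == c].mean(axis=0), X[y == c]) for c in np.unique(y))
    return split - pooled  # larger = more class-discriminative

def select_top_k(X, y, k):
    """Indices of the k highest-scoring features, sorted."""
    scores = bernoulli_nb_scores(X, y)
    return np.sort(np.argsort(scores)[-k:])

# Toy data: feature 0 tracks the label, the rest are noise.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, 200)
X = rng.random((200, 5)) < 0.5
X[:, 0] = (y + (rng.random(200) < 0.1)) % 2
print(select_top_k(X, y, 1))
```

The scoring pass is linear in the size of the data matrix, consistent with the near-linear running time the abstract claims for the sparse models.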
Source URL: