Asymptotic Behavior of Adversarial Training Estimator under ℓ∞-Perturbation

Publication type:
Article; Early Access
Authors:
Xie, Yiling; Huo, Xiaoming
Affiliations:
University System of Georgia; Georgia Institute of Technology
Journal:
JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION
ISSN:
0162-1459
DOI:
10.1080/01621459.2025.2485346
Publication year:
2025
Keywords:
framework; Lasso
Abstract:
Adversarial training has been proposed to protect machine learning models against adversarial attacks. This article focuses on adversarial training under ℓ∞-perturbation, which has recently attracted much research attention. The asymptotic behavior of the adversarial training estimator is investigated in the generalized linear model. The results imply that the asymptotic distribution of the adversarial training estimator under ℓ∞-perturbation can put a positive probability mass at 0 when the true parameter is 0, providing a theoretical guarantee of the associated sparsity-recovery ability. In addition, a two-step procedure is proposed, adaptive adversarial training, which can further improve the performance of adversarial training under ℓ∞-perturbation. Specifically, the proposed procedure achieves asymptotic variable-selection consistency and unbiasedness. Numerical experiments are conducted to show the sparsity-recovery ability of adversarial training under ℓ∞-perturbation and to compare the empirical performance of classic adversarial training and adaptive adversarial training. Supplementary materials for this article are available online, including a standardized description of the materials available for reproducing the work.
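For context, the adversarial training estimator under ℓ∞-perturbation is commonly formulated as the following min-max problem (a sketch using generic notation; the paper's exact definitions may differ):

```latex
\hat{\beta}^{\mathrm{adv}}_n \;\in\; \arg\min_{\beta}\;
\frac{1}{n}\sum_{i=1}^{n}\;
\max_{\|\delta_i\|_{\infty}\le \epsilon}\,
\ell\bigl(y_i,\,(x_i+\delta_i)^{\top}\beta\bigr),
```

where $\ell$ is the loss of the generalized linear model and $\epsilon \ge 0$ is the perturbation radius. In the linear model with squared loss, the inner maximum admits the known closed form $\bigl(|y_i - x_i^{\top}\beta| + \epsilon\|\beta\|_{1}\bigr)^{2}$, so ℓ∞ adversarial training behaves like an ℓ₁-regularized (Lasso-type) problem; this connection is what makes a point mass at 0 in the asymptotic distribution, and hence sparsity recovery, plausible.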
Source URL: