A Bias-Accuracy-Privacy Trilemma for Statistical Estimation
Publication type:
Article; Early Access
Authors:
Kamath, Gautam; Mouzakis, Argyris; Regehr, Matthew; Singhal, Vikrant; Steinke, Thomas; Ullman, Jonathan
Affiliations:
University of Waterloo; Alphabet Inc.; DeepMind; Northeastern University
Journal:
JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION
ISSN:
0162-1459
DOI:
10.1080/01621459.2024.2443275
Publication date:
2025
Keywords:
Abstract:
Differential privacy (DP) is a rigorous notion of data privacy, used for private statistics. The canonical algorithm for differentially private mean estimation is to first clip the samples to a bounded range and then add noise to their empirical mean. Clipping controls the sensitivity and, hence, the variance of the noise that we add for privacy. But clipping also introduces statistical bias. This tradeoff is inherent: we prove that no algorithm can simultaneously have low bias, low error, and low privacy loss for arbitrary distributions. Additionally, we show that under strong notions of DP (i.e., pure or concentrated DP), unbiased mean estimation is impossible, even if we assume that the data is sampled from a Gaussian. On the positive side, we show that unbiased mean estimation is possible under a more permissive notion of differential privacy (approximate DP) if we assume that the distribution is symmetric. Supplementary materials for this article are available online, including a standardized description of the materials available for reproducing the work.
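The clip-then-noise procedure described in the abstract can be sketched as follows. This is an illustrative implementation of the standard Gaussian mechanism for approximate DP, not the authors' algorithm; the function name `private_clipped_mean` and the choice of a symmetric clipping interval `[-c, c]` are assumptions made for the example.

```python
import numpy as np

def private_clipped_mean(samples, clip_bound, epsilon, delta, rng=None):
    """(epsilon, delta)-DP mean estimate via clip-then-noise (illustrative sketch).

    Each sample is clipped to [-clip_bound, clip_bound], so replacing one
    sample changes the empirical mean by at most 2 * clip_bound / n; this
    bound is the sensitivity, which calibrates the Gaussian noise scale.
    """
    rng = rng or np.random.default_rng()
    x = np.clip(np.asarray(samples, dtype=float), -clip_bound, clip_bound)
    n = len(x)
    sensitivity = 2.0 * clip_bound / n
    # One standard calibration of the Gaussian mechanism's noise scale.
    sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    return x.mean() + rng.normal(0.0, sigma)
```

The tradeoff the paper studies is visible here: shrinking `clip_bound` shrinks `sigma` (less noise, lower variance) but pulls samples outside `[-clip_bound, clip_bound]` toward the boundary, biasing the estimate whenever the distribution has mass beyond the clipping range.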