CANONICAL NOISE DISTRIBUTIONS AND PRIVATE HYPOTHESIS TESTS

Publication type:
Article
Authors:
Awan, Jordan; Vadhan, Salil
Affiliations:
Purdue University System; Purdue University
Journal:
ANNALS OF STATISTICS
ISSN/ISBN:
0090-5364
DOI:
10.1214/23-AOS2259
Publication date:
2023
Pages:
547-572
Keywords:
Differential Privacy
Abstract:
f-DP has recently been proposed as a generalization of differential privacy allowing a lossless analysis of composition, post-processing, and privacy amplification via subsampling. In the setting of f-DP, we propose the concept of a canonical noise distribution (CND), the first mechanism designed for an arbitrary f-DP guarantee. The notion of CND captures whether an additive privacy mechanism perfectly matches the privacy guarantee of a given f. We prove that a CND always exists, and give a construction that produces a CND for any f. We show that private hypothesis tests are intimately related to CNDs, allowing for the release of private p-values at no additional privacy cost, as well as the construction of uniformly most powerful (UMP) tests for binary data, within the general f-DP framework.

We apply our techniques to the problem of difference-of-proportions testing, and construct a UMP unbiased (UMPU) semiprivate test which upper bounds the performance of any f-DP test. Using this as a benchmark, we propose a private test based on the inversion of characteristic functions, which allows for optimal inference on the two population parameters and is nearly as powerful as the semiprivate UMPU. When specialized to the case of (ε, 0)-DP, we show empirically that our proposed test is more powerful than any (ε/√2)-DP test and has more accurate type I errors than the classic normal approximation test.
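
The sketch below (Python, not code from the paper) illustrates the objects the abstract refers to: the tradeoff function of (ε, 0)-DP, f(α) = max(0, 1 − e^ε·α, e^(−ε)·(1 − α)), and a noise CDF built from its fixed point c with f(c) = c, linear on [−1/2, 1/2] and extended outward through f so that a unit shift of the noise traces out f. The function names (tradeoff_pure_dp, cnd_cdf) and the exact form of the recursion are illustrative assumptions; consult the paper for the precise CND definition and construction.

    # Illustrative sketch only; names and recursion form are assumptions, not the paper's code.
    import math

    def tradeoff_pure_dp(alpha: float, eps: float) -> float:
        """Tradeoff function of (eps, 0)-DP: max(0, 1 - e^eps*alpha, e^-eps*(1 - alpha))."""
        return max(0.0, 1.0 - math.exp(eps) * alpha, math.exp(-eps) * (1.0 - alpha))

    def cnd_cdf(x: float, f, c: float) -> float:
        """CDF that is linear on [-1/2, 1/2] with F(-1/2) = c and F(1/2) = 1 - c,
        where c satisfies f(c) = c, and is extended outward through f itself."""
        if x < -0.5:
            return f(1.0 - cnd_cdf(x + 1.0, f, c))
        if x > 0.5:
            return 1.0 - f(cnd_cdf(x - 1.0, f, c))
        return c * (0.5 - x) + (1.0 - c) * (x + 0.5)

    if __name__ == "__main__":
        eps = 1.0
        f = lambda a: tradeoff_pure_dp(a, eps)
        c = 1.0 / (1.0 + math.exp(eps))  # fixed point f(c) = c for (eps, 0)-DP
        for x in (-2.0, -0.5, 0.0, 0.5, 2.0):
            print(x, round(cnd_cdf(x, f, c), 4))

Running the example prints a monotone set of CDF values symmetric about F(0) = 1/2, which is the qualitative behavior one expects of an additive noise distribution matched to a symmetric tradeoff function.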