Isotonic mechanism for exponential family estimation in machine learning peer review
Document type:
Article; Early Access
Authors:
Yan, Yuling; Su, Weijie J.; Fan, Jianqing
Affiliations:
University of Wisconsin System; University of Wisconsin Madison; University of Pennsylvania; Princeton University
Journal:
JOURNAL OF THE ROYAL STATISTICAL SOCIETY SERIES B-STATISTICAL METHODOLOGY
ISSN/ISBN:
1369-7412
DOI:
10.1093/jrsssb/qkaf025
Publication date:
2025
Keywords:
regression
bounds
Abstract:
In 2023, the International Conference on Machine Learning (ICML) required authors with multiple submissions to rank their papers by perceived quality. In this paper, we leverage these author-specified rankings to enhance peer review in machine learning and artificial intelligence conferences by extending the isotonic mechanism to exponential family distributions. This mechanism produces adjusted scores that are closely aligned with the original scores while strictly adhering to the author-specified rankings. An appealing feature of the mechanism is its applicability to a broad class of exponential family distributions without requiring knowledge of the specific distribution form. We show that an author is incentivized to provide accurate rankings if her utility is a convex additive function of the adjusted review scores. For a subclass of exponential family distributions, we prove that an author reports truthfully only if elicitation involves pairwise comparisons between her submissions, thus highlighting the optimality of rankings in truthful information elicitation. Moreover, the adjusted scores significantly enhance estimation accuracy compared to the original scores and achieve near-minimax optimality when the ground-truth scores have bounded total variation. We conclude with a numerical analysis using ICML 2023 ranking data, demonstrating that the isotonic mechanism yields substantial improvements in estimating a proxy for the ground-truth quality of submissions.
Source URL:
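To illustrate the score-adjustment step described in the abstract, the following is a minimal sketch, assuming the adjustment is the standard squared-error isotonic projection of the raw review scores onto the author-specified order (computed here with the pool-adjacent-violators algorithm). The function name `isotonic_adjust`, the example scores, and the ranking are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def isotonic_adjust(scores, ranking):
    """Hypothetical sketch: adjust raw review scores to respect an author ranking.

    scores  : raw review scores, one per submission (e.g. average reviewer score)
    ranking : author-specified ranking; ranking[0] is the index of the paper the
              author considers best, ranking[1] the second best, and so on.

    Returns adjusted scores that are as close as possible to the raw scores in
    squared error while being non-increasing along the author's ranking.
    """
    y = np.asarray(scores, dtype=float)[list(ranking)]  # reorder: best first

    # Pool Adjacent Violators for a non-increasing fit: run PAVA with a
    # non-decreasing constraint on -y, then flip the sign back.
    vals = list(-y)
    weights = [1.0] * len(vals)
    blocks = [[i] for i in range(len(vals))]
    i = 0
    while i < len(vals) - 1:
        if vals[i] > vals[i + 1]:          # violation of non-decreasing order
            total_w = weights[i] + weights[i + 1]
            merged = (weights[i] * vals[i] + weights[i + 1] * vals[i + 1]) / total_w
            vals[i], weights[i] = merged, total_w
            blocks[i] += blocks[i + 1]
            del vals[i + 1], weights[i + 1], blocks[i + 1]
            i = max(i - 1, 0)              # re-check against the previous block
        else:
            i += 1

    fitted = np.empty(len(y))
    for v, block in zip(vals, blocks):
        fitted[block] = -v                 # undo the sign flip

    adjusted = np.empty(len(y))
    adjusted[list(ranking)] = fitted       # map back to the original paper order
    return adjusted

# Example: three submissions with raw scores 5.0, 6.5, 6.0; the author ranks
# paper 0 above paper 1 above paper 2, so the adjusted scores must be
# non-increasing in that order (here all three pool to the same value).
print(isotonic_adjust([5.0, 6.5, 6.0], ranking=[0, 1, 2]))
```

The projection keeps the adjusted scores as close as possible to the reviewers' scores, which is why the mechanism can improve estimation accuracy whenever the author's ranking is truthful.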