QUANTIFYING REPLICABILITY OF MULTIPLE STUDIES IN A META-ANALYSIS

Publication type:
Article
Authors:
Xiao, Mengli; Chu, Haitao; Hodges, James S.; Lin, Lifeng
Affiliations:
University of Colorado System; University of Colorado Anschutz Medical Campus; Pfizer; Pfizer USA; University of Minnesota System; University of Minnesota Twin Cities; University of Arizona
Journal:
ANNALS OF APPLIED STATISTICS
ISSN/ISBN:
1932-6157
DOI:
10.1214/23-AOAS1806
Publication date:
2024
Pages:
664-682
Keywords:
network; replication; heterogeneity; challenges; standard tests
Abstract:
For valid scientific discoveries, it is fundamental to evaluate whether research findings are replicable across different settings. While large-scale replication projects across broad research topics are not feasible, systematic reviews and meta-analyses (SRMAs) offer a viable alternative for assessing replicability. Because studies are included or excluded on subjective grounds, SRMAs may contain nonreplicable study findings. However, there is no consensus on rigorous methods to assess the replicability of SRMAs or to explore sources of nonreplicability, and nonreplicability is often misconceived as high heterogeneity. This article introduces a new measure, the externally standardized residuals from a leave-m-studies-out procedure, to quantify replicability. It not only measures the impact of nonreplicability from unknown sources on the conclusion of an SRMA but also differentiates nonreplicability from heterogeneity. A new test statistic for replicability is derived. We explore its asymptotic properties and use extensive simulations and real data to illustrate the measure's performance. We conclude that replicability should be routinely assessed for all SRMAs and recommend sensitivity analyses once nonreplicable study results are identified in an SRMA.
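To make the leave-m-studies-out idea concrete, below is a minimal sketch of the m = 1 case: each study's estimate is compared with the pooled estimate obtained after removing that study, standardized by the combined within-study, between-study, and pooled-estimate variability. This uses a DerSimonian-Laird random-effects model; the function names, toy data, and exact standardization are illustrative assumptions and not the authors' implementation or test statistic.

```python
import numpy as np

def dl_tau2(y, v):
    """DerSimonian-Laird estimate of the between-study variance tau^2."""
    w = 1.0 / v
    theta_fe = np.sum(w * y) / np.sum(w)          # common-effect pooled estimate
    q = np.sum(w * (y - theta_fe) ** 2)           # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    return max(0.0, (q - (len(y) - 1)) / c)

def loo_standardized_residuals(y, v):
    """Externally standardized residuals from a leave-one-study-out
    random-effects meta-analysis (an illustrative sketch)."""
    y, v = np.asarray(y, float), np.asarray(v, float)
    k = len(y)
    resid = np.empty(k)
    for i in range(k):
        mask = np.arange(k) != i
        y_m, v_m = y[mask], v[mask]
        tau2 = dl_tau2(y_m, v_m)                  # tau^2 without study i
        w = 1.0 / (v_m + tau2)
        theta = np.sum(w * y_m) / np.sum(w)       # pooled estimate without study i
        var_theta = 1.0 / np.sum(w)               # variance of that pooled estimate
        # residual of study i against the estimate that excludes it
        resid[i] = (y[i] - theta) / np.sqrt(v[i] + tau2 + var_theta)
    return resid

# Toy example: hypothetical log odds ratios and their within-study variances.
y = [0.10, 0.25, -0.05, 1.20, 0.18]
v = [0.04, 0.06, 0.05, 0.05, 0.03]
print(loo_standardized_residuals(y, v))
```

Under this sketch, a study whose residual is large in absolute value is flagged as potentially nonreplicable, whereas moderate residuals spread across all studies are more consistent with ordinary between-study heterogeneity.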
Source URL: