Differences in misinformation sharing can lead to politically asymmetric sanctions

Document Type:
Article
Authors:
Mosleh, Mohsen; Yang, Qi; Zaman, Tauhid; Pennycook, Gordon; Rand, David G.
Affiliations:
University of Oxford; University of Exeter; Massachusetts Institute of Technology (MIT); Massachusetts Institute of Technology (MIT); Yale University; Cornell University; Massachusetts Institute of Technology (MIT)
Journal:
Nature
ISSN/ISBN:
0028-0836
DOI:
10.1038/s41586-024-07942-8
Publication Date:
2024-10-17
Pages:
609+
Keywords:
fake news
Abstract:
In response to intense pressure, technology companies have enacted policies to combat misinformation(1-4). The enforcement of these policies has, however, led to technology companies being regularly accused of political bias(5-7). We argue that differential sharing of misinformation by people identifying with different political groups(8-15) could lead to political asymmetries in enforcement, even by unbiased policies. We first analysed 9,000 politically active Twitter users during the US 2020 presidential election. Although users estimated to be pro-Trump/conservative were indeed substantially more likely to be suspended than those estimated to be pro-Biden/liberal, users who were pro-Trump/conservative also shared far more links to various sets of low-quality news sites (even when news quality was determined by politically balanced groups of laypeople, or by groups of only Republican laypeople) and had higher estimated likelihoods of being bots. We find similar associations between stated or inferred conservatism and low-quality news sharing (on the basis of both expert and politically balanced layperson ratings) in 7 other datasets of sharing from Twitter, Facebook and survey experiments, spanning 2016 to 2023 and including data from 16 different countries. Thus, even under politically neutral anti-misinformation policies, political asymmetries in enforcement should be expected. Political imbalance in enforcement need not imply bias on the part of social media companies implementing anti-misinformation policies.