Why 'open' AI systems are actually closed, and why this matters
Publication type:
Review
Authors:
Widder, David Gray; Whittaker, Meredith; West, Sarah Myers
Affiliations:
Cornell University; University of Western Australia
Journal:
Nature
ISSN:
0028-0836
DOI:
10.1038/s41586-024-08141-1
Publication date:
2024-11-28
Pages:
827-833
Abstract:
This paper examines 'open' artificial intelligence (AI). Claims about 'open' AI often lack precision, frequently eliding scrutiny of substantial industry concentration in large-scale AI development and deployment, and often incorrectly applying understandings of 'open' imported from free and open-source software to AI systems. At present, powerful actors are seeking to shape policy using claims that 'open' AI is either beneficial to innovation and democracy, on the one hand, or detrimental to safety, on the other. When policy is being shaped, definitions matter. To add clarity to this debate, we examine the basis for claims of openness in AI, and offer a material analysis of what AI is and what 'openness' in AI can and cannot provide: examining models, data, labour, frameworks, and computational power. We highlight three main affordances of 'open' AI, namely transparency, reusability, and extensibility, and we observe that maximally 'open' AI allows some forms of oversight and experimentation on top of existing models. However, we find that openness alone does not perturb the concentration of power in AI. Just as many traditional open-source software projects were co-opted in various ways by large technology companies, we show how rhetoric around 'open' AI is frequently wielded in ways that exacerbate rather than reduce concentration of power in the AI sector. A review of the literature on openness in artificial intelligence systems shows that 'open' AI systems are in practice closed, because they remain highly dependent on the resources of a few large corporate actors.