FinBERT: A Large Language Model for Extracting Information from Financial Text
Publication Type:
Article
Authors:
Huang, Allen H.; Wang, Hui; Yang, Yi
Affiliation:
Renmin University of China
Journal:
CONTEMPORARY ACCOUNTING RESEARCH
ISSN/ISBN:
0823-9150
DOI:
10.1111/1911-3846.12832
Publication Date:
2023
Pages:
806-841
Keywords:
earnings
readability
disclosure
Abstract:
We develop FinBERT, a state-of-the-art large language model adapted to the finance domain. We show that FinBERT incorporates finance knowledge and can better summarize contextual information in financial texts. Using a sample of researcher-labeled sentences from analyst reports, we document that FinBERT substantially outperforms the Loughran and McDonald dictionary and other machine learning algorithms, including naive Bayes, support vector machine, random forest, convolutional neural network, and long short-term memory, in sentiment classification. Our results show that FinBERT excels at identifying the positive or negative sentiment of sentences that other algorithms mislabel as neutral, likely because it uses contextual information in financial text. We find that FinBERT's advantage over other algorithms, and over Google's original Bidirectional Encoder Representations from Transformers (BERT) model, is especially salient when the training sample is small and in texts containing financial words not frequently used in general texts. FinBERT also outperforms other models in identifying discussions related to environmental, social, and governance (ESG) issues. Last, we show that other approaches underestimate the textual informativeness of earnings conference calls by at least 18% compared to FinBERT. Our results have implications for academic researchers, investment professionals, and financial market regulators.
Source URL: