Contextualized news in corporate disclosures: a neural language approach
Abstract
I quantify and explain value-relevant news in textual disclosures using word context. I improve upon traditional methods by (i) modeling disclosures as sequentially connected and interacting elements rather than stand-alone narrative attributes, and (ii) directly predicting the magnitude and direction of disclosure news. Applying a new textual-analysis approach, a BERT-based neural language model, I find that contextualized news in quarterly earnings announcement text explains five times more variation in short-window stock returns than traditional narrative attributes and offers large incremental explanatory power relative to reported earnings. I also demonstrate that contextualized disclosures strongly predict future earnings, and that much of the news content arises from (a) word sequencing and connecting words (i.e., context), (b) text describing numbers, and (c) text at the beginning of disclosures. Overall, this study highlights the importance of contextualized disclosures for researchers, regulators, and practitioners.
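To make the modeling approach concrete, the sketch below shows one way a BERT-based encoder with a regression head could map disclosure text directly to a signed news estimate, as the abstract describes. This is a minimal illustration under stated assumptions, not the paper's actual architecture: the encoder name (bert-base-uncased), the single linear head, the sample sentence, and the suggested loss are all hypothetical choices.

```python
# Illustrative sketch only: a BERT encoder plus a regression head that maps
# disclosure text to a signed scalar, so one forward pass yields both the
# direction (sign) and magnitude of disclosure news. All specifics here
# (model name, head, sample text, loss) are assumptions for illustration.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer


class DisclosureNewsModel(nn.Module):
    def __init__(self, encoder_name: str = "bert-base-uncased"):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(encoder_name)
        hidden = self.encoder.config.hidden_size
        # Single linear head producing one signed scalar per document.
        self.head = nn.Linear(hidden, 1)

    def forward(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        # The [CLS] embedding summarizes the contextualized word sequence,
        # capturing word order and connecting words rather than treating
        # the text as a bag of stand-alone attributes.
        cls = out.last_hidden_state[:, 0]
        return self.head(cls).squeeze(-1)


tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = DisclosureNewsModel()

# Hypothetical earnings-announcement sentence, for illustration only.
text = "Net revenue increased 12% year over year, exceeding prior guidance."
batch = tokenizer(text, truncation=True, max_length=512, return_tensors="pt")
predicted_news = model(batch["input_ids"], batch["attention_mask"])

# One plausible training target: short-window abnormal returns, e.g.
# loss = nn.functional.mse_loss(predicted_news, realized_returns)
```

Because the head outputs a single signed value, direction and magnitude of the news are predicted jointly, mirroring the prediction task the abstract describes.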