Knowledge Entropy Decay During Language Model Pretraining Hinders New Knowledge Acquisition
Abstract
In this work, we investigate how a model's tendency to broadly integrate its parametric knowledge evolves throughout pretraining, and how this behavior affects overall performance, particularly in terms of knowledge acquisition and forgetting. We introduce the concept of knowledge entropy, which quantifies the range of memory sources the model engages with; high knowledge entropy indicates that the model utilizes a wide range of memory sources, while low knowledge entropy suggests reliance on specific sources with greater certainty. Our analysis reveals a consistent decline in knowledge entropy as pretraining advances. We also find that the decline is closely associated with a reduction in the model's ability to acquire and retain knowledge, leading us to conclude that diminishing knowledge entropy (smaller number of active memory sources) impairs the model's knowledge acquisition and retention capabilities. We find further support for this by demonstrating that increasing the activity of inactive memory sources enhances the model's capacity for knowledge acquisition and retention.
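To make the abstract's central quantity concrete, here is a minimal sketch of how knowledge entropy could be computed, following the framing of a model engaging a set of memory sources with per-token coefficients. The function name `knowledge_entropy`, the absolute-value normalization, and the averaging over tokens are assumptions made for this illustration, not the paper's exact implementation.

```python
import torch


def knowledge_entropy(memory_coefficients: torch.Tensor) -> torch.Tensor:
    """Shannon entropy of each token's distribution over memory sources.

    memory_coefficients: (num_tokens, num_memories) tensor of how strongly
    each token engages each memory source (illustrative assumption: the
    model's intermediate feed-forward activations play this role).
    """
    # Turn absolute coefficients into a probability distribution per token.
    p = memory_coefficients.abs()
    p = p / p.sum(dim=-1, keepdim=True).clamp_min(1e-12)
    # High entropy: engagement spread over many sources; low entropy:
    # reliance on a few sources with greater certainty.
    h = -(p * (p + 1e-12).log()).sum(dim=-1)
    # Average the per-token entropies into a single scalar.
    return h.mean()


# Example: 4 tokens engaging 8 memory sources; a perfectly uniform
# engagement would give the maximum value log(8) ≈ 2.08.
coeffs = torch.randn(4, 8)
print(knowledge_entropy(coeffs))
```

Under this framing, the decay the paper observes corresponds to this scalar shrinking as pretraining proceeds, and the final intervention described in the abstract amounts to boosting the coefficients of rarely engaged sources.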
Cite
Text
Kim et al. "Knowledge Entropy Decay During Language Model Pretraining Hinders New Knowledge Acquisition." International Conference on Learning Representations, 2025.
Markdown
[Kim et al. "Knowledge Entropy Decay During Language Model Pretraining Hinders New Knowledge Acquisition." International Conference on Learning Representations, 2025.](https://mlanthology.org/iclr/2025/kim2025iclr-knowledge/)
BibTeX
@inproceedings{kim2025iclr-knowledge,
title = {{Knowledge Entropy Decay During Language Model Pretraining Hinders New Knowledge Acquisition}},
author = {Kim, Jiyeon and Lee, Hyunji and Cho, Hyowon and Jang, Joel and Hwang, Hyeonbin and Won, Seungpil and Ahn, Youbin and Lee, Dohaeng and Seo, Minjoon},
booktitle = {International Conference on Learning Representations},
year = {2025},
url = {https://mlanthology.org/iclr/2025/kim2025iclr-knowledge/}
}