Pointer-Guided Pre-Training: Infusing Large Language Models with Paragraph-Level Contextual Awareness

Cite

Text

Hillebrand et al. "Pointer-Guided Pre-Training: Infusing Large Language Models with Paragraph-Level Contextual Awareness." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2024. doi:10.1007/978-3-031-70359-1_23

Markdown

[Hillebrand et al. "Pointer-Guided Pre-Training: Infusing Large Language Models with Paragraph-Level Contextual Awareness." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2024.](https://mlanthology.org/ecmlpkdd/2024/hillebrand2024ecmlpkdd-pointerguided/) doi:10.1007/978-3-031-70359-1_23

BibTeX

@inproceedings{hillebrand2024ecmlpkdd-pointerguided,
  title     = {{Pointer-Guided Pre-Training: Infusing Large Language Models with Paragraph-Level Contextual Awareness}},
  author    = {Hillebrand, Lars and Pradhan, Prabhupad and Bauckhage, Christian and Sifa, Rafet},
  booktitle = {European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases},
  year      = {2024},
  pages     = {386--402},
  doi       = {10.1007/978-3-031-70359-1_23},
  url       = {https://mlanthology.org/ecmlpkdd/2024/hillebrand2024ecmlpkdd-pointerguided/}
}