Well-Written Knowledge Graphs: Most Effective RDF Syntaxes for Triple Linearization in End-to-End Extraction of Relations from Texts (Student Abstract)
Abstract
Seq-to-seq generative models recently gained attention for solving the relation extraction task. By approaching this problem as an end-to-end task, they surpassed encoder-only models. Little research has investigated the effects of the output syntaxes on the training process of these models. Moreover, a limited number of approaches have been proposed for extracting ready-to-load knowledge graphs following the RDF standard. In this paper, we consider that a set of triples can be linearized in many different ways, and we evaluate the combined effect of the size of the language models and of different RDF syntaxes on the task of relation extraction from Wikipedia abstracts.
Cite
Text
Ringwald et al. "Well-Written Knowledge Graphs: Most Effective RDF Syntaxes for Triple Linearization in End-to-End Extraction of Relations from Texts (Student Abstract)." AAAI Conference on Artificial Intelligence, 2024. doi:10.1609/AAAI.V38I21.30502
Markdown
[Ringwald et al. "Well-Written Knowledge Graphs: Most Effective RDF Syntaxes for Triple Linearization in End-to-End Extraction of Relations from Texts (Student Abstract)." AAAI Conference on Artificial Intelligence, 2024.](https://mlanthology.org/aaai/2024/ringwald2024aaai-well/) doi:10.1609/AAAI.V38I21.30502
BibTeX
@inproceedings{ringwald2024aaai-well,
title = {{Well-Written Knowledge Graphs: Most Effective RDF Syntaxes for Triple Linearization in End-to-End Extraction of Relations from Texts (Student Abstract)}},
author = {Ringwald, Célian and Gandon, Fabien and Faron, Catherine and Michel, Franck and Akl, Hanna Abi},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2024},
pages = {23631--23632},
doi = {10.1609/AAAI.V38I21.30502},
url = {https://mlanthology.org/aaai/2024/ringwald2024aaai-well/}
}