Entailment Graph Learning with Textual Entailment and Soft Transitivity

Abstract

Typed entailment graphs aim to learn entailment relations between predicates from text and model them as edges between predicate nodes. Their construction usually suffers from severe sparsity and the unreliability of distributional similarity. We propose a two-stage method, Entailment Graph with Textual Entailment and Transitivity (EGT2). EGT2 learns local entailment relations by recognizing textual entailment between template sentences formed from typed CCG-parsed predicates. Based on the generated local graph, EGT2 then applies three novel soft transitivity constraints to model the logical transitivity of entailment structures. Experiments on benchmark datasets show that EGT2 effectively models transitivity in entailment graphs, alleviating sparsity, and significantly improves over current state-of-the-art methods. The released paper can be found at https://arxiv.org/abs/2204.03286.
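As background for the transitivity idea above, the sketch below computes the *hard* transitive closure of a small entailment graph: if predicate p entails q and q entails r, then p entails r. This is the strict logical rule that EGT2's soft constraints relax into differentiable penalties; the predicates and edges here are illustrative examples, not taken from the paper.

```python
def transitive_closure(edges):
    """Given directed entailment edges (p, q) meaning "p entails q",
    repeatedly add (p, r) whenever (p, q) and (q, r) are both present,
    until no new edge can be derived."""
    closure = set(edges)
    changed = True
    while changed:
        changed = False
        for (p, q) in list(closure):
            for (q2, r) in list(closure):
                if q == q2 and (p, r) not in closure:
                    closure.add((p, r))
                    changed = True
    return closure

# Illustrative example: "buy" entails "own", "own" entails "have",
# so transitivity derives that "buy" entails "have".
edges = {("buy", "own"), ("own", "have")}
print(sorted(transitive_closure(edges)))
```

In practice, entailment graphs built from noisy distributional or textual-entailment scores rarely satisfy this closure exactly, which is why the paper introduces soft constraints instead of enforcing the hard rule.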

Cite

Text

Chen et al. "Entailment Graph Learning with Textual Entailment and Soft Transitivity." ICLR 2022 Workshops: DLG4NLP, 2022.

Markdown

[Chen et al. "Entailment Graph Learning with Textual Entailment and Soft Transitivity." ICLR 2022 Workshops: DLG4NLP, 2022.](https://mlanthology.org/iclrw/2022/chen2022iclrw-entailment/)

BibTeX

@inproceedings{chen2022iclrw-entailment,
  title     = {{Entailment Graph Learning with Textual Entailment and Soft Transitivity}},
  author    = {Chen, Zhibin and Feng, Yansong and Zhao, Dongyan},
  booktitle = {ICLR 2022 Workshops: DLG4NLP},
  year      = {2022},
  url       = {https://mlanthology.org/iclrw/2022/chen2022iclrw-entailment/}
}