Veracity: An Open-Source AI Fact-Checking System

Abstract

The proliferation of misinformation poses a significant threat to society, exacerbated by the capabilities of generative AI. This demo paper introduces Veracity, an open-source AI system designed to empower individuals to combat misinformation through transparent and accessible fact-checking. Veracity leverages the synergy between Large Language Models (LLMs) and web retrieval agents to analyze user-submitted claims and provide grounded veracity assessments with intuitive explanations. Key features include multilingual support, numerical scoring of claim veracity, and an interactive interface inspired by familiar messaging applications. This paper showcases Veracity's ability not only to detect misinformation but also to explain its reasoning, fostering media literacy and promoting a more informed society.

Cite

Text

Curtis et al. "Veracity: An Open-Source AI Fact-Checking System." International Joint Conference on Artificial Intelligence, 2025. doi:10.24963/IJCAI.2025/1254

Markdown

[Curtis et al. "Veracity: An Open-Source AI Fact-Checking System." International Joint Conference on Artificial Intelligence, 2025.](https://mlanthology.org/ijcai/2025/curtis2025ijcai-veracity/) doi:10.24963/IJCAI.2025/1254

BibTeX

@inproceedings{curtis2025ijcai-veracity,
  title     = {{Veracity: An Open-Source AI Fact-Checking System}},
  author    = {Curtis, Taylor Lynn and Touzel, Maximilian Puelma and Garneau, William and Gruaz, Manon and Pinder, Mike and Wang, Li Wei and Krishna, Sukanya and Cohen, Luda and Godbout, Jean-François and Rabbany, Reihaneh and Pelrine, Kellin},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2025},
  pages     = {11021--11024},
  doi       = {10.24963/IJCAI.2025/1254},
  url       = {https://mlanthology.org/ijcai/2025/curtis2025ijcai-veracity/}
}