Math2Sym: A System for Solving Elementary Problems via Large Language Models and Symbolic Solvers
Abstract
Traditional models for solving math word problems (MWPs) often struggle to capture both linguistic context and arithmetic reasoning. We propose Math2Sym, a novel approach integrating large language models (LLMs) with symbolic solvers. This method leverages LLMs' language comprehension and the precision of symbolic computation to efficiently convert MWPs into solvable symbolic form. We introduce the EMSF dataset for training models to formalize math problems across various complexities. On our benchmark test set, fine-tuned models outperform GPT-3.5 by 17% on few-shot tasks and perform comparably to GPT-4-mini on elementary math problems.
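The symbolic stage of such a pipeline can be sketched as follows. This is a minimal illustration, not the paper's actual interface: it assumes the LLM has already translated a word problem (e.g., "the sum of two numbers is 10 and their difference is 2") into SymPy equations, which the symbolic solver then resolves exactly.

```python
# Hypothetical sketch of the symbolic-solving stage: an LLM would emit
# the equations below from the word problem; SymPy then solves them.
from sympy import symbols, Eq, solve

x, y = symbols("x y")
# LLM-produced symbolic form of the word problem (assumed output).
equations = [Eq(x + y, 10), Eq(x - y, 2)]
solution = solve(equations, (x, y))
print(solution)  # {x: 6, y: 4}
```

Delegating the arithmetic to a symbolic engine like this is what gives the hybrid approach its precision: the LLM only needs to formalize the problem, not compute the answer.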
Cite
Text
Nguyen et al. "Math2Sym: A System for Solving Elementary Problems via Large Language Models and Symbolic Solvers." NeurIPS 2024 Workshops: MATH-AI, 2024.
Markdown
[Nguyen et al. "Math2Sym: A System for Solving Elementary Problems via Large Language Models and Symbolic Solvers." NeurIPS 2024 Workshops: MATH-AI, 2024.](https://mlanthology.org/neuripsw/2024/nguyen2024neuripsw-math2sym/)
BibTeX
@inproceedings{nguyen2024neuripsw-math2sym,
title = {{Math2Sym: A System for Solving Elementary Problems via Large Language Models and Symbolic Solvers}},
author = {Nguyen, Minh Phu and Pham, Minh Phuong and Ngo, Man and Minh, Kha Tuan},
booktitle = {NeurIPS 2024 Workshops: MATH-AI},
year = {2024},
url = {https://mlanthology.org/neuripsw/2024/nguyen2024neuripsw-math2sym/}
}