Solving Math Word Problems with Teacher Supervision
Abstract
Math word problems (MWPs) have recently been addressed with Seq2Seq models that `translate' a math problem described in natural language into a mathematical expression, following a typical encoder-decoder structure. Although effective on classical math problems, these models fail when a subtle variation in the wording of a problem leads to a remarkably different answer. We find that this failure occurs because MWPs with different answers but similar formula expressions are encoded closely in the latent space. We therefore designed a teacher module that makes the MWP encoding vector match the correct solution and disaccord from wrong solutions, which are generated by manipulating the correct solution. Experimental results on two benchmark MWP datasets verify that our proposed solution outperforms state-of-the-art models.
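The teacher supervision described above can be thought of as a contrastive objective: pull the problem encoding toward the correct solution's encoding and push it away from encodings of perturbed (wrong) solutions. The following is a minimal sketch of such a loss, not the paper's actual implementation; the function name `teacher_loss`, the use of cosine similarity, and the hinge margin are all illustrative assumptions.

```python
import math

def cosine(u, v):
    # Cosine similarity between two dense encoding vectors.
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def teacher_loss(problem_enc, correct_enc, wrong_encs, margin=0.5):
    """Hypothetical hinge-style teacher loss: reward similarity between
    the problem encoding and the correct solution encoding, and penalize
    any wrong-solution encoding that comes within `margin` of it."""
    pos = cosine(problem_enc, correct_enc)
    loss = 1.0 - pos  # pull toward the correct solution
    for w in wrong_encs:
        # push each manipulated (wrong) solution at least `margin` away
        loss += max(0.0, cosine(problem_enc, w) - pos + margin)
    return loss
```

With this shape of objective, an encoder trained end-to-end is discouraged from mapping problems with similar surface formulas but different answers to nearby latent points.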
Cite
Text
Liang and Zhang. "Solving Math Word Problems with Teacher Supervision." International Joint Conference on Artificial Intelligence, 2021. doi:10.24963/IJCAI.2021/485
Markdown
[Liang and Zhang. "Solving Math Word Problems with Teacher Supervision." International Joint Conference on Artificial Intelligence, 2021.](https://mlanthology.org/ijcai/2021/liang2021ijcai-solving/) doi:10.24963/IJCAI.2021/485
BibTeX
@inproceedings{liang2021ijcai-solving,
title = {{Solving Math Word Problems with Teacher Supervision}},
author = {Liang, Zhenwen and Zhang, Xiangliang},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2021},
pages = {3522--3528},
doi = {10.24963/IJCAI.2021/485},
url = {https://mlanthology.org/ijcai/2021/liang2021ijcai-solving/}
}