Publication Type
Conference Proceeding Article
Version
publishedVersion
Publication Date
7-2020
Abstract
While recent tree-based neural models have demonstrated promising results in generating solution expressions for math word problems (MWPs), most of these models do not capture the relationships and order information among the quantities well. This results in poor quantity representations and incorrect solution expressions. In this paper, we propose Graph2Tree, a novel deep learning architecture that combines the merits of a graph-based encoder and a tree-based decoder to generate better solution expressions. Our Graph2Tree framework includes two graphs, namely the Quantity Cell Graph and the Quantity Comparison Graph, which are designed to address the limitations of existing methods by effectively representing the relationships and order information among the quantities in MWPs. Extensive experiments on two benchmark datasets show that Graph2Tree significantly outperforms state-of-the-art baselines. We also discuss case studies and empirically examine Graph2Tree's effectiveness in translating MWP text into solution expressions.
Discipline
Databases and Information Systems
Research Areas
Data Science and Engineering
Publication
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
First Page
3928
Last Page
3937
Identifier
10.18653/v1/2020.acl-main.362
Publisher
Association for Computational Linguistics
City or Country
Online
Citation
ZHANG, Jipeng; WANG, Lei; LEE, Roy Ka-Wei; BIN, Yi; WANG, Yan; SHAO, Jie; and LIM, Ee-peng.
Graph-to-tree learning for solving math word problems. (2020). Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics. 3928-3937.
Available at: https://ink.library.smu.edu.sg/sis_research/5273
Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Additional URL
https://doi.org/10.18653/v1/2020.acl-main.362