Publication Type
Conference Proceeding Article
Version
Submitted Version
Publication Date
3-2018
Abstract
Machine comprehension is concerned with teaching machines to answer reading comprehension questions. In this paper, we adopt an LSTM-based model we designed earlier for textual entailment and propose two new models for cloze-style machine comprehension. In our first model, we treat the document as a premise and the question as a hypothesis, and use an LSTM with attention mechanisms to match the question with the document. This LSTM remembers the best answer token found in the document while processing the question. Furthermore, we observe some special properties of machine comprehension and propose a two-layer LSTM model. In this model, we treat the question as a premise and use LSTMs to match each sentence in the document with the question. We further chain the final states of these LSTMs using another LSTM in order to aggregate the results. When evaluated on the commonly used CNN/Daily Mail dataset, both of our models are competitive with the state of the art, and the second, two-layer model outperforms the first.
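The abstract describes the second, two-layer architecture only at a high level. Below is a minimal PyTorch sketch of that idea; it is not the authors' implementation, and the module names, dimensions, and the simple concatenation-based matching (standing in for the paper's attention mechanism) are illustrative assumptions.

```python
import torch
import torch.nn as nn

class TwoLayerAggregator(nn.Module):
    """Sketch of the two-layer idea: per-sentence match LSTMs whose final
    states are chained by a second, aggregating LSTM."""

    def __init__(self, vocab_size=10000, emb_dim=100, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.question_lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        # Layer 1: match one document sentence against the question summary.
        self.sentence_lstm = nn.LSTM(emb_dim + hidden_dim, hidden_dim, batch_first=True)
        # Layer 2: chain the per-sentence final states to aggregate evidence.
        self.aggregate_lstm = nn.LSTM(hidden_dim, hidden_dim, batch_first=True)
        self.scorer = nn.Linear(hidden_dim, 1)

    def forward(self, question_ids, sentences_ids):
        # question_ids: (1, Lq) token ids; sentences_ids: list of (1, Ls) tensors.
        _, (q_h, _) = self.question_lstm(self.embed(question_ids))
        q_summary = q_h[-1]                                  # (1, hidden_dim)
        finals = []
        for sent_ids in sentences_ids:
            sent_emb = self.embed(sent_ids)                  # (1, Ls, emb_dim)
            # Concatenate the question summary to every sentence token; a crude
            # stand-in for the attention-based matching described in the paper.
            q_tiled = q_summary.unsqueeze(1).expand(-1, sent_emb.size(1), -1)
            _, (h, _) = self.sentence_lstm(torch.cat([sent_emb, q_tiled], dim=-1))
            finals.append(h[-1])                             # final state, (1, hidden_dim)
        chained = torch.stack(finals, dim=1)                 # (1, num_sentences, hidden_dim)
        out, _ = self.aggregate_lstm(chained)
        return self.scorer(out).squeeze(-1)                  # one score per sentence
```

Under these assumptions, calling the module with a tokenized question and a list of tokenized document sentences yields one score per sentence; the models in the paper instead predict the answer token directly, but the sentence-level matching followed by LSTM aggregation follows the structure the abstract describes.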
Discipline
Artificial Intelligence and Robotics | Databases and Information Systems
Research Areas
Data Science and Engineering
Publication
Computational Linguistics and Intelligent Text Processing: 19th International Conference, CICLing 2018, Hanoi, Vietnam, March 18-24, 2018: Revised Selected Papers
Publisher
Springer
City or Country
Cham
Citation
WANG, Shuohang and JIANG, Jing.
An LSTM model for cloze-style machine comprehension. (2018). Computational Linguistics and Intelligent Text Processing: 19th International Conference, CICLing 2018, Hanoi, Vietnam, March 18-24, 2018: Revised Selected Papers.
Available at: https://ink.library.smu.edu.sg/sis_research/4084
Copyright Owner and License
Authors
Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.