Publication Type
Conference Proceeding Article
Version
publishedVersion
Publication Date
8-2017
Abstract
In this paper, we study how to improve the domain adaptability of a deletion-based Long Short-Term Memory (LSTM) neural network model for sentence compression. We hypothesize that syntactic information helps in making such models more robust across domains. We propose two major changes to the model: using explicit syntactic features and introducing syntactic constraints through Integer Linear Programming (ILP). Our evaluation shows that the proposed model works better than the original model as well as a traditional non-neural-network-based model in a cross-domain setting.
Discipline
Databases and Information Systems | Programming Languages and Compilers
Publication
Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics, Vancouver, Canada, 2017 July 30 - August 4
First Page
1385
Last Page
1393
Identifier
10.18653/v1/P17-1127
City or Country
Vancouver, Canada
Citation
WANG, Liangguo; JIANG, Jing; CHIEU, Hai Leong; ONG, Chen Hui; SONG, Dandan; and LIAO, Lejian.
Can syntax help? Improving an LSTM-based Sentence Compression Model for New Domains. (2017). Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics, Vancouver, Canada, 2017 July 30 - August 4. 1385-1393.
Available at: https://ink.library.smu.edu.sg/sis_research/3901
Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Additional URL
https://doi.org/10.18653/v1/P17-1127