Publication Type

Conference Proceeding Article

Publication Date

8-2017

Abstract

In this paper, we study how to improve the domain adaptability of a deletion-based Long Short-Term Memory (LSTM) neural network model for sentence compression. We hypothesize that syntactic information helps in making such models more robust across domains. We propose two major changes to the model: using explicit syntactic features and introducing syntactic constraints through Integer Linear Programming (ILP). Our evaluation shows that the proposed model works better than the original model as well as a traditional non-neural-network-based model in a cross-domain setting.
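The sketch below is not the authors' code; it is a minimal illustration, under assumed dimensions and vocabulary sizes, of what a deletion-based compression model with explicit syntactic features might look like: a bidirectional LSTM tags each token as keep or delete, with POS-tag and dependency-label embeddings concatenated to the word embeddings. The ILP decoding step that enforces syntactic constraints is omitted here.

```python
# Minimal sketch (illustrative only) of a deletion-based sentence-compression tagger.
# A BiLSTM scores each token KEEP/DELETE; syntactic features (POS tag, dependency
# label) are embedded and concatenated to word embeddings. Sizes are assumptions.
import torch
import torch.nn as nn

class DeletionCompressor(nn.Module):
    def __init__(self, vocab_size, pos_size, dep_size,
                 word_dim=100, feat_dim=25, hidden_dim=128):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, word_dim)
        self.pos_emb = nn.Embedding(pos_size, feat_dim)   # syntactic feature: POS tag
        self.dep_emb = nn.Embedding(dep_size, feat_dim)   # syntactic feature: dependency label
        self.lstm = nn.LSTM(word_dim + 2 * feat_dim, hidden_dim,
                            batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * hidden_dim, 2)           # per-token KEEP(0)/DELETE(1) logits

    def forward(self, words, pos, dep):
        x = torch.cat([self.word_emb(words),
                       self.pos_emb(pos),
                       self.dep_emb(dep)], dim=-1)
        h, _ = self.lstm(x)
        return self.out(h)                                # shape: (batch, seq_len, 2)

# Toy usage: one 5-token sentence with made-up word/POS/dependency indices.
model = DeletionCompressor(vocab_size=1000, pos_size=50, dep_size=40)
words = torch.randint(0, 1000, (1, 5))
pos = torch.randint(0, 50, (1, 5))
dep = torch.randint(0, 40, (1, 5))
logits = model(words, pos, dep)
keep_or_delete = logits.argmax(dim=-1)                    # greedy per-token decision
```

In the paper's setting, the per-token scores would instead be decoded jointly, with ILP constraints ensuring the retained tokens form a syntactically well-formed compression.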

Discipline

Databases and Information Systems | Programming Languages and Compilers

Publication

Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics, Vancouver, Canada, 2017 July 30 - August 4

First Page

1385

Last Page

1393

City or Country

Vancouver, Canada

Creative Commons License

Creative Commons Attribution-Noncommercial-No Derivative Works 4.0 License
This work is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 4.0 License.
