Publication Type

Conference Proceeding Article

Version

publishedVersion

Publication Date

8-2017

Abstract

In this paper, we study how to improve the domain adaptability of a deletion-based Long Short-Term Memory (LSTM) neural network model for sentence compression. We hypothesize that syntactic information helps in making such models more robust across domains. We propose two major changes to the model: using explicit syntactic features and introducing syntactic constraints through Integer Linear Programming (ILP). Our evaluation shows that the proposed model works better than the original model as well as a traditional non-neural-network-based model in a cross-domain setting.

Discipline

Databases and Information Systems | Programming Languages and Compilers

Publication

Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics, Vancouver, Canada, 2017 July 30 - August 4

First Page

1385

Last Page

1393

Identifier

10.18653/v1/P17-1127

City or Country

Vancouver, Canada

Additional URL

https://doi.org/10.18653/v1/P17-1127
