"Densely connected bidirectional LSTM with applications to sentence cla" by Zixiang DING, Rui XIA et al.

Publication Type

Conference Proceeding Article

Book Title/Conference/Journal

Proceedings of the 7th CCF International Conference on Natural Language Processing and Chinese Computing (NLPCC 2018), Hohhot, China, August 26-30, 2018

Year

8-2018

Abstract

Deep neural networks have recently been shown to achieve highly competitive performance on many computer vision tasks due to their ability to explore a much larger hypothesis space. However, since most deep architectures such as stacked RNNs tend to suffer from the vanishing-gradient and overfitting problems, their effects are still understudied in many NLP tasks. Motivated by this, we propose a novel multi-layer RNN model in this paper called densely connected bidirectional long short-term memory (DC-Bi-LSTM), which essentially represents each layer by the concatenation of its hidden state and all preceding layers' hidden states, then recursively passes each layer's representation to all subsequent layers. We evaluate the proposed model on five benchmark sentence classification datasets. DC-Bi-LSTM with depth up to 20 can be trained successfully and obtains significant improvements over the traditional Bi-LSTM with the same or even fewer parameters. Moreover, our model shows promising performance compared with state-of-the-art approaches.
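
The dense-connection idea in the abstract can be sketched in a few lines of PyTorch: each Bi-LSTM layer consumes the concatenation of the word embeddings and every preceding layer's hidden states, so the feature width grows with depth. This is a minimal illustrative sketch, not the authors' implementation; the hyperparameters (vocabulary size, embedding size, hidden size, depth, class count) and the mean-pooling classifier head are assumptions.

```python
# Minimal sketch of a densely connected Bi-LSTM (DC-Bi-LSTM).
# All sizes below are illustrative, not taken from the paper.
import torch
import torch.nn as nn

class DCBiLSTM(nn.Module):
    def __init__(self, vocab_size=10000, embed_dim=100, hidden_dim=13,
                 num_layers=15, num_classes=5):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.layers = nn.ModuleList()
        in_dim = embed_dim
        for _ in range(num_layers):
            # Each Bi-LSTM sees the concatenation of the embeddings and
            # all preceding layers' hidden states (the dense connection).
            self.layers.append(nn.LSTM(in_dim, hidden_dim,
                                       batch_first=True, bidirectional=True))
            in_dim += 2 * hidden_dim  # input width grows by 2*hidden per layer

    # in_dim now equals embed_dim + num_layers * 2 * hidden_dim
        self.classifier = nn.Linear(in_dim, num_classes)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len)
        features = self.embedding(token_ids)
        for lstm in self.layers:
            out, _ = lstm(features)                        # (batch, seq, 2*hidden)
            features = torch.cat([features, out], dim=-1)  # pass everything upward
        # Mean-pool over time, then classify (pooling choice is an assumption).
        return self.classifier(features.mean(dim=1))

if __name__ == "__main__":
    model = DCBiLSTM()
    dummy = torch.randint(0, 10000, (4, 20))  # batch of 4 sentences, length 20
    print(model(dummy).shape)                 # torch.Size([4, 5])
```

Because later layers receive all earlier hidden states directly, gradients reach the lower layers through short paths, which is how the paper's deep (up to 20-layer) stacks stay trainable while each layer can remain narrow.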

Disciplines

Programming Languages and Compilers

Subject(s)

Applied or Integration/Application Scholarship

ISSN/ISBN

9783319995007

Publisher

Springer

DOI

10.1007/978-3-319-99501-4

Version

publishedVersion

Language

eng

Copyright Holder

Authors

Format

application/pdf

Additional URL

https://doi.org/10.1007/978-3-319-99501-4
