"Multi-level head-wise match and aggregation in transformer for textual" by Shuohang WANG, Yunshi LAN et al.
 

Publication Type

Conference Proceeding Article

Version

publishedVersion

Publication Date

2-2020

Abstract

Transformer has been successfully applied to many natural language processing tasks. However, for textual sequence matching, simple matching between the representations of a pair of sequences might bring in unnecessary noise. In this paper, we propose a new approach to sequence pair matching with Transformer, by learning head-wise matching representations on multiple levels. Experiments show that our proposed approach can achieve new state-of-the-art performance on multiple tasks that rely only on pre-computed sequence-vector representations, such as SNLI, MNLI-match, MNLI-mismatch, QQP, and SQuAD-binary.
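The abstract describes learning head-wise matching representations at multiple levels of the Transformer and aggregating them into a single prediction. Below is a minimal PyTorch sketch of that idea, assuming pooled per-head vectors per layer as inputs; the function names (head_wise_match, multi_level_match), the matching feature set (concatenation, difference, product), and the mean-pooling aggregation are illustrative assumptions, not the paper's exact formulation.

# Hypothetical sketch of multi-level head-wise matching; not the authors' code.
import torch
import torch.nn as nn

def head_wise_match(h_a, h_b):
    # h_a, h_b: pooled per-head vectors for the two input sequences,
    # shape (num_heads, head_dim). A common matching feature set:
    # concatenation, element-wise difference, and element-wise product.
    return torch.cat([h_a, h_b, h_a - h_b, h_a * h_b], dim=-1)

def multi_level_match(heads_a, heads_b, classifier):
    # heads_a / heads_b: one (num_heads, head_dim) tensor per Transformer
    # layer, so matching happens at multiple levels of the encoder.
    matched = [head_wise_match(a, b) for a, b in zip(heads_a, heads_b)]
    stacked = torch.stack(matched)     # (num_layers, num_heads, 4*head_dim)
    pooled = stacked.mean(dim=(0, 1))  # aggregate over layers and heads
    return classifier(pooled)

# Usage with random stand-in encodings (3-way NLI-style output):
num_layers, num_heads, head_dim = 12, 8, 64
classifier = nn.Linear(4 * head_dim, 3)
heads_a = [torch.randn(num_heads, head_dim) for _ in range(num_layers)]
heads_b = [torch.randn(num_heads, head_dim) for _ in range(num_layers)]
logits = multi_level_match(heads_a, heads_b, classifier)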

Discipline

Artificial Intelligence and Robotics | Databases and Information Systems | Numerical Analysis and Scientific Computing

Research Areas

Data Science and Engineering

Publication

Proceedings of the 34th AAAI Conference on Artificial Intelligence, New York, February 7-12, 2020

First Page

9209

Last Page

9216

Identifier

10.1609/aaai.v34i05.6458

Publisher

AAAI

City or Country

New York

Copyright Owner and License

Publisher

Additional URL

https://doi.org/10.1609/aaai.v34i05.6458

