Publication Type
Conference Proceeding Article
Version
publishedVersion
Publication Date
2-2020
Abstract
Transformer has been successfully applied to many natural language processing tasks. However, for textual sequence matching, simple matching between the representations of a pair of sequences might introduce unnecessary noise. In this paper, we propose a new approach to sequence pair matching with Transformer, by learning head-wise matching representations on multiple levels. Experiments show that our proposed approach can achieve new state-of-the-art performance on multiple tasks that rely only on pre-computed sequence-vector representations, such as SNLI, MNLI-match, MNLI-mismatch, QQP, and SQuAD-binary.
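The abstract's "head-wise matching on multiple levels" can be made concrete with a short illustrative sketch. The PyTorch code below is not the authors' implementation: the class name HeadwiseMatcher, the [u; v; |u - v|; u * v] match feature, and the mean-pooling aggregation over heads and levels are all assumptions chosen to show the general shape of the idea, assuming pooled per-head vectors are already available for each sequence at every Transformer layer.

import torch
import torch.nn as nn

class HeadwiseMatcher(nn.Module):
    # Illustrative sketch only (not the paper's code): compare two sequences
    # head-by-head at every Transformer level, then aggregate the matches.
    def __init__(self, head_dim=64, hidden=256, num_classes=3):
        super().__init__()
        # Hypothetical per-head match layer over the common heuristic
        # feature [u; v; |u - v|; u * v].
        self.match = nn.Linear(4 * head_dim, hidden)
        self.classifier = nn.Linear(hidden, num_classes)

    def forward(self, heads_a, heads_b):
        # heads_a, heads_b: (batch, levels, heads, head_dim) pooled per-head
        # vectors for the two input sequences.
        feats = torch.cat(
            [heads_a, heads_b, (heads_a - heads_b).abs(), heads_a * heads_b],
            dim=-1,
        )                                         # (B, L, H, 4 * head_dim)
        matched = torch.tanh(self.match(feats))   # head-wise match vectors
        # Mean-pool over heads and levels; a stand-in for whatever
        # aggregation the paper actually uses.
        pooled = matched.mean(dim=(1, 2))         # (B, hidden)
        return self.classifier(pooled)

# Usage with random per-head vectors for a batch of two pairs:
B, L, H, D = 2, 4, 8, 64
model = HeadwiseMatcher(head_dim=D)
logits = model(torch.randn(B, L, H, D), torch.randn(B, L, H, D))
print(logits.shape)  # torch.Size([2, 3])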
Discipline
Artificial Intelligence and Robotics | Databases and Information Systems | Numerical Analysis and Scientific Computing
Research Areas
Data Science and Engineering
Publication
Proceedings of the 34th AAAI Conference on Artificial Intelligence, New York, February 7-12, 2020
First Page
9209
Last Page
9216
Identifier
10.1609/aaai.v34i05.6458
Publisher
AAAI
City or Country
New York
Citation
WANG, Shuohang; LAN, Yunshi; TAY, Yi; JIANG, Jing; and LIU, Jingjing.
Multi-level head-wise match and aggregation in transformer for textual sequence matching. (2020). Proceedings of the 34th AAAI Conference on Artificial Intelligence, New York, February 7-12, 2020. 9209-9216.
Available at: https://ink.library.smu.edu.sg/sis_research/5601
Copyright Owner and License
Publisher
Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Additional URL
https://doi.org/10.1609/aaai.v34i05.6458