Publication Type
Conference Proceeding Article
Version
publishedVersion
Publication Date
11-2020
Abstract
In this paper, we propose Cross-Thought, a novel approach to pre-training sequence encoders, which is instrumental in building reusable sequence embeddings for large-scale NLP tasks such as question answering. Instead of using the original signals of full sentences, we train a Transformer-based sequence encoder over a large set of short sequences, which allows the model to automatically select the most useful information for predicting masked words. Experiments on question answering and textual entailment tasks demonstrate that our pre-trained encoder can outperform state-of-the-art encoders trained with continuous sentence signals as well as traditional masked language modeling baselines. Our proposed approach also achieves a new state of the art on HotpotQA (full-wiki setting) by improving intermediate information retrieval performance.
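The mechanism the abstract describes, pre-training over batches of short sequences and letting each sequence attend to the embeddings of the others when recovering masked words, can be sketched roughly as follows. This is a minimal illustrative sketch in PyTorch under simplifying assumptions, not the authors' released implementation: the class name CrossThoughtSketch, all layer sizes, and the use of a single [CLS]-style embedding per sequence are assumptions made for brevity (the paper itself prepends several special tokens per sequence).

```python
# Minimal sketch of the Cross-Thought pre-training idea. Hypothetical names
# and dimensions throughout; not the authors' code.
import torch
import torch.nn as nn

class CrossThoughtSketch(nn.Module):
    def __init__(self, vocab_size=30522, d_model=256, nhead=4,
                 num_layers=2, max_len=64):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, d_model)
        self.pos_emb = nn.Embedding(max_len, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        # Cross-sequence attention: each short sequence's embedding attends
        # over the embeddings of the other sequences in the batch, so the
        # model can pull in whatever context helps recover masked words.
        self.cross_attn = nn.MultiheadAttention(d_model, nhead, batch_first=True)
        self.mlm_head = nn.Linear(d_model, vocab_size)

    def forward(self, token_ids):
        # token_ids: (num_seqs, seq_len); position 0 is a special slot whose
        # final hidden state serves as the reusable sequence embedding.
        n, t = token_ids.shape
        pos = torch.arange(t, device=token_ids.device).unsqueeze(0)
        h = self.encoder(self.tok_emb(token_ids) + self.pos_emb(pos))
        seq_emb = h[:, 0]                                # (num_seqs, d_model)
        # Treat the sequence embeddings as one length-num_seqs "sentence"
        # and let every sequence attend to all the others.
        q = seq_emb.unsqueeze(0)                         # (1, num_seqs, d_model)
        ctx, _ = self.cross_attn(q, q, q)
        ctx = ctx.squeeze(0)                             # (num_seqs, d_model)
        # Fuse the cross-sequence context back into each token state before
        # the masked-word prediction head.
        fused = h + ctx.unsqueeze(1)
        return self.mlm_head(fused)                      # (num_seqs, seq_len, vocab)

# Usage: mask random tokens in a batch of short sequences and train with
# cross-entropy on the masked positions (standard MLM loss).
model = CrossThoughtSketch()
ids = torch.randint(0, 30522, (8, 16))  # 8 short sequences of 16 tokens
logits = model(ids)                     # (8, 16, 30522)
```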
Discipline
Databases and Information Systems | Programming Languages and Compilers
Research Areas
Data Science and Engineering
Publication
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP 2020), Virtual Conference, November 16-20
First Page
412
Last Page
421
Identifier
10.18653/v1/2020.emnlp-main.30
Publisher
Association for Computational Linguistics
City or Country
Virtual Conference
Citation
1
Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Additional URL
https://arxiv.org/pdf/2010.03652.pdf