Publication Type
Conference Proceeding Article
Version
publishedVersion
Publication Date
8-2021
Abstract
Pre-trained multilingual language models, e.g., multilingual BERT, are widely used in cross-lingual tasks, yielding state-of-the-art performance. However, such models suffer from a large performance gap between source and target languages, especially in the zero-shot setting, where the models are fine-tuned only on English but tested on other languages for the same task. We tackle this issue by incorporating language-agnostic information, specifically universal syntax such as dependency relations and POS tags, into language models, based on the observation that universal syntax is transferable across different languages. Our approach, named COunterfactual SYntax (COSY), includes the design of SYntax-aware networks as well as a COunterfactual training method that implicitly forces the networks to learn not only the semantics but also the syntax. To evaluate COSY, we conduct cross-lingual experiments on natural language inference and question answering using mBERT and XLM-R as network backbones. Our results show that COSY achieves state-of-the-art performance on both tasks without using auxiliary datasets.
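As a rough illustration of the zero-shot cross-lingual setting the abstract refers to (not of the COSY method itself), the sketch below applies a multilingual BERT classifier, assumed to have already been fine-tuned on English NLI data only, directly to a non-English example. The model name, label order, and example sentences are illustrative assumptions, not the paper's exact configuration.

import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# mBERT backbone; in the zero-shot setting its classification head is
# fine-tuned on English NLI data only and then reused, unchanged, for
# other languages at test time.
MODEL = "bert-base-multilingual-cased"
tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForSequenceClassification.from_pretrained(MODEL, num_labels=3)
model.eval()

# German premise/hypothesis pair, never seen during (English-only) fine-tuning.
premise = "Der Hund schläft auf dem Sofa."
hypothesis = "Ein Tier ruht sich aus."

inputs = tokenizer(premise, hypothesis, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

labels = ["entailment", "neutral", "contradiction"]  # assumed label order
print(labels[logits.argmax(dim=-1).item()])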
Keywords
Computational linguistics, Natural language processing systems, Semantics
Discipline
Databases and Information Systems
Research Areas
Data Science and Engineering
Publication
ACL-IJCNLP 2021: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
First Page
577
Last Page
589
ISBN
9781954085527
Identifier
10.18653/v1/2021.acl-long.48
Publisher
Association for Computational Linguistics
City or Country
Online
Citation
YU, Sicheng; ZHANG, Hao; NIU, Yulei; SUN, Qianru; and JIANG, Jing.
COSY: COunterfactual SYntax for cross-lingual understanding. (2021). ACL-IJCNLP 2021: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers). 577-589.
Available at: https://ink.library.smu.edu.sg/sis_research/6510
Copyright Owner and License
Publisher
Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Additional URL
https://doi.org/10.18653/v1/2021.acl-long.48