Publication Type

Conference Proceeding Article

Version

Published version

Publication Date

10-2022

Abstract

Recently, a new training loss, order-agnostic cross entropy (OAXE), has proven effective at ameliorating the effect of multimodality in non-autoregressive translation (NAT) by removing the penalty on word order errors in the standard cross-entropy loss. Starting from the intuition that reordering generally occurs between phrases, we extend OAXE by allowing reordering only between ngram phrases while still requiring a strict match of word order within each phrase. Extensive experiments on NAT benchmarks across language pairs and data scales demonstrate the effectiveness and universality of our approach. Further analyses show that ngram-OAXE indeed improves the translation of ngram phrases and produces more fluent translations with better modeling of sentence structure.
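
The abstract's core idea lends itself to a compact illustration: OAXE scores the reference under its best word reordering, and ngram-OAXE restricts that search to reorderings of intact ngram phrases. Below is a minimal, hypothetical PyTorch sketch of such a loss for a single sentence, not the authors' released implementation; the name ngram_oaxe_loss, the fixed non-overlapping segmentation into ngrams, and the Hungarian-algorithm matching of phrases to position slots are assumptions made for this sketch.

```python
import torch
from scipy.optimize import linear_sum_assignment

def ngram_oaxe_loss(log_probs: torch.Tensor, target: torch.Tensor, n: int = 2) -> torch.Tensor:
    """Illustrative ngram-OAXE-style loss for one sentence (a sketch, not the paper's code).

    log_probs: (T, V) per-position log-probabilities from a NAT model.
    target:    (T,)   reference token ids; this sketch assumes T % n == 0.

    The reference is cut into consecutive ngram phrases. Each phrase is scored
    at every candidate start slot with word order kept strict inside the phrase,
    and the Hungarian algorithm picks the cheapest one-to-one assignment of
    phrases to slots, i.e. the best reordering between phrases only.
    """
    T = target.size(0)
    assert T % n == 0, "sketch assumes the target length is a multiple of n"
    ngrams = target.view(-1, n)   # consecutive, non-overlapping phrases
    slots = range(0, T, n)        # candidate phrase start positions

    # cost[i, j] = negative log-likelihood of phrase i placed at slot j,
    # summing token log-probs in order within the phrase.
    cost = torch.stack([
        torch.stack([
            -log_probs[s:s + n].gather(1, g.unsqueeze(1)).sum()
            for s in slots
        ])
        for g in ngrams
    ])

    # Lowest-cost phrase-to-slot matching; gradients flow through the
    # selected entries of the (differentiable) cost matrix.
    rows, cols = linear_sum_assignment(cost.detach().cpu().numpy())
    return cost[torch.from_numpy(rows), torch.from_numpy(cols)].sum() / T

# Toy usage: a random 6-token sentence scored with bigram (n=2) phrases.
logits = torch.randn(6, 100, requires_grad=True)
loss = ngram_oaxe_loss(torch.log_softmax(logits, dim=-1), torch.randint(100, (6,)))
loss.backward()
```

In this sketch, setting n to the full sentence length degenerates to ordinary cross entropy, while n = 1 recovers fully order-agnostic matching, mirroring the spectrum between the two losses that the abstract describes.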

Discipline

Databases and Information Systems

Research Areas

Data Science and Engineering

Publication

Proceedings of the 29th International Conference on Computational Linguistics (COLING 2022), Gyeongju, Republic of Korea, October 12-17, 2022

First Page

5035

Last Page

5045

Publisher

Association for Computational Linguistics

City or Country

Gyeongju, Republic of Korea

Additional URL

https://aclanthology.org/2022.coling-1.446/
