Publication Type

Conference Proceeding Article

Version

publishedVersion

Publication Date

7-2015

Abstract

We present a simple yet effective unsupervised domain adaptation method that can be generally applied to different NLP tasks. Our method uses unlabeled target domain instances to induce a set of instance similarity features. These features are then combined with the original features to represent labeled source domain instances. Using three NLP tasks, we show that our method consistently outperforms a few baselines, including SCL, an existing general unsupervised domain adaptation method widely used in NLP. More importantly, our method is very easy to implement and incurs much less computational cost than SCL.
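The core idea described in the abstract can be sketched in a few lines: represent each labeled source instance with its original features plus its similarities to a handful of unlabeled target domain instances. The sketch below is a minimal illustration only, assuming random sampling of target "exemplars" and cosine similarity; the paper's actual exemplar induction and similarity measure may differ, and the function name is hypothetical.

```python
import numpy as np

def similarity_features(source_X, target_X, n_exemplars=5, seed=0):
    """Augment source instances with instance-similarity features.

    Illustrative sketch (not the paper's exact procedure): pick a few
    unlabeled target instances as exemplars, then append each source
    instance's cosine similarity to every exemplar as extra features.
    """
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(target_X), size=n_exemplars, replace=False)
    exemplars = target_X[idx]                      # shape (k, d)

    def l2norm(M):
        # Row-normalize, guarding against zero vectors
        return M / np.clip(np.linalg.norm(M, axis=1, keepdims=True), 1e-12, None)

    sims = l2norm(source_X) @ l2norm(exemplars).T  # shape (n, k)
    return np.hstack([source_X, sims])             # shape (n, d + k)

# Toy usage: 4 source instances, 10 unlabeled target instances, d = 3
rng = np.random.default_rng(1)
src = rng.random((4, 3))
tgt = rng.random((10, 3))
aug = similarity_features(src, tgt, n_exemplars=5)
print(aug.shape)  # (4, 8): 3 original features + 5 similarity features
```

The augmented matrix can then be fed to any standard supervised learner trained on the source labels, which is what makes the approach easy to implement compared with representation-learning methods such as SCL.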

Keywords

Computational linguistics, Linguistics, Domain adaptation, Computational costs, Target domain, Natural language processing systems

Discipline

Computer Sciences | Databases and Information Systems

Research Areas

Data Science and Engineering

Publication

Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing: Beijing, China, July 26-31, 2015

Volume

2

First Page

168

Last Page

173

ISBN

9781941643730

Identifier

10.3115/v1/P15-2028

Publisher

Association for Computational Linguistics

City or Country

Stroudsburg, PA

Copyright Owner and License

Publisher

Creative Commons License

Creative Commons Attribution 4.0 International License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Additional URL

https://doi.org/10.3115/v1/P15-2028
