Publication Type

Conference Proceeding Article

Version

publishedVersion

Publication Date

7-2020

Abstract

Event Detection (ED) is a fundamental task in automatically structuring texts. Due to the small scale of training data, previous methods perform poorly on unseen/sparsely labeled trigger words and are prone to overfitting on densely labeled trigger words. To address this issue, we propose a novel Enrichment Knowledge Distillation (EKD) model that leverages external open-domain trigger knowledge to reduce the in-built bias toward frequent trigger words in annotations. Experiments on the ACE2005 benchmark show that our model outperforms nine strong baselines and is especially effective for unseen/sparsely labeled trigger words. The source code is released at https://github.com/shuaiwa16/ekd.git.
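
For readers unfamiliar with the distillation component named in the abstract, the sketch below illustrates a generic soft-label knowledge-distillation loss in PyTorch. It is a minimal sketch of the general technique only; the function name, temperature, and loss weighting are illustrative assumptions and do not reproduce the paper's EKD objective (see the linked source code for the actual method).

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    """Generic soft-label distillation: blend hard-label cross-entropy
    with a KL term pulling the student's softened distribution toward
    the teacher's. Hyperparameters here are illustrative assumptions,
    not values from the EKD paper."""
    # Soften both distributions with the temperature.
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    log_soft_student = F.log_softmax(student_logits / temperature, dim=-1)

    # KL divergence between softened distributions; the T^2 factor
    # keeps gradient magnitudes comparable across temperatures.
    kd = F.kl_div(log_soft_student, soft_teacher,
                  reduction="batchmean") * temperature ** 2

    # Standard cross-entropy on the gold labels.
    ce = F.cross_entropy(student_logits, labels)

    return alpha * kd + (1.0 - alpha) * ce

# Toy usage: a batch of 4 tokens over 10 hypothetical event-type classes.
student = torch.randn(4, 10)
teacher = torch.randn(4, 10)
gold = torch.randint(0, 10, (4,))
print(distillation_loss(student, teacher, gold))
```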

Discipline

Databases and Information Systems | Graphics and Human Computer Interfaces

Research Areas

Data Science and Engineering

Publication

Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, Virtual Conference, 2020 July 5-10

First Page

5887

Last Page

5897

Identifier

10.18653/v1/2020.acl-main.522

Publisher

Association for Computational Linguistics

City or Country

Virtual Conference

Additional URL

http://doi.org/10.18653/v1/2020.acl-main.522
