Publication Type
Conference Proceeding Article
Version
publishedVersion
Publication Date
7-2020
Abstract
Event Detection (ED) is a fundamental task in automatically structuring texts. Due to the small scale of training data, previous methods perform poorly on unseen or sparsely labeled trigger words and are prone to overfitting on densely labeled trigger words. To address this issue, we propose a novel Enrichment Knowledge Distillation (EKD) model that leverages external open-domain trigger knowledge to reduce the built-in bias toward frequently annotated trigger words. Experiments on the ACE2005 benchmark show that our model outperforms nine strong baselines and is especially effective for unseen or sparsely labeled trigger words. The source code is released at https://github.com/shuaiwa16/ekd.git.
Discipline
Databases and Information Systems | Graphics and Human Computer Interfaces
Research Areas
Data Science and Engineering
Publication
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, Virtual Conference, 2020 July 5-10
First Page
5887
Last Page
5897
Identifier
10.18653/v1/2020.acl-main.522
Publisher
Association for Computational Linguistics
City or Country
Virtual Conference
Citation
TONG, Meihan; XU, Bin; WANG, Shuai; CAO, Yixin; HOU, Lei; LI, Juanzi; and XIE, Jun.
Improving event detection via open-domain event trigger knowledge. (2020). Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, Virtual Conference, 2020 July 5-10. 5887-5897.
Available at: https://ink.library.smu.edu.sg/sis_research/7450
Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Additional URL
http://doi.org/10.18653/v1/2020.acl-main.522
Included in
Databases and Information Systems Commons, Graphics and Human Computer Interfaces Commons