Publication Type
Conference Proceeding Article
Version
publishedVersion
Publication Date
11-2020
Abstract
The prevalent use of social media enables the rapid spread of rumors on a massive scale, which leads to an emerging need for automatic rumor verification (RV). A number of previous studies focus on leveraging stance classification to enhance RV with multi-task learning (MTL) methods. However, most of these methods failed to employ pre-trained contextualized embeddings such as BERT, and did not exploit inter-task dependencies by using predicted stance labels to improve the RV task. Therefore, in this paper, to extend BERT to obtain thread representations, we first propose a Hierarchical Transformer, which divides each long thread into shorter subthreads and employs BERT to represent each subthread separately, followed by a global Transformer layer that encodes all the subthreads. We further propose a Coupled Transformer Module to capture the inter-task interactions and a Post-Level Attention layer to use the predicted stance labels for RV. Experiments on two benchmark datasets show the superiority of our Coupled Hierarchical Transformer model over existing MTL approaches.
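To make the hierarchical encoding idea in the abstract concrete, the sketch below shows one way to encode a long thread by running BERT over each subthread and then passing the subthread representations through a global Transformer layer. It assumes a PyTorch/HuggingFace setup; the class name, [CLS] pooling, and hyperparameters are illustrative assumptions, not the authors' implementation.

# Minimal sketch of hierarchical thread encoding (assumption: PyTorch + HuggingFace Transformers).
# BERT encodes each subthread separately; a global Transformer layer then lets the
# subthread representations attend to one another to form the thread representation.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

class HierarchicalThreadEncoder(nn.Module):
    def __init__(self, bert_name="bert-base-uncased", hidden=768, heads=8):
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)        # shared subthread encoder
        self.global_layer = nn.TransformerEncoderLayer(         # "global" layer over subthreads
            d_model=hidden, nhead=heads, batch_first=True)

    def forward(self, input_ids, attention_mask):
        # input_ids / attention_mask: (num_subthreads, seq_len) for a single thread
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        cls = out.last_hidden_state[:, 0, :]                    # one [CLS] vector per subthread
        # Treat the subthread vectors as a sequence so they can attend to each other.
        thread_repr = self.global_layer(cls.unsqueeze(0))       # (1, num_subthreads, hidden)
        return thread_repr.squeeze(0)

# Usage: split a long thread into shorter subthreads, tokenize each, then encode.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
subthreads = ["source post ...", "replies in subthread 1 ...", "replies in subthread 2 ..."]
enc = tokenizer(subthreads, padding=True, truncation=True, max_length=128, return_tensors="pt")
reps = HierarchicalThreadEncoder()(enc["input_ids"], enc["attention_mask"])  # (num_subthreads, 768)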
Discipline
Databases and Information Systems | Social Media
Research Areas
Data Science and Engineering
Publication
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing, Virtual Conference, November 16-20
First Page
1392
Last Page
1401
Identifier
10.18653/v1/2020.emnlp-main.108
Publisher
Association for Computational Linguistics
City or Country
Virtual Conference
Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.