Publication Type
Journal Article
Version
publishedVersion
Publication Date
10-2023
Abstract
In this work, we investigate the connection between browsing behavior and task quality of crowdsourcing workers performing annotation tasks that require information judgements. Such information judgements are often required to derive ground truth answers to information retrieval queries. We explore the use of workers’ browsing behavior to directly determine their annotation result quality. We hypothesize user attention to be the main factor contributing to a worker’s annotation quality. To predict annotation quality at the task level, we model two aspects of task-specific user attention, namely general and semantic user attention. Both aspects of user attention can be modeled using different types of browsing behavior features, but most previous research focuses on the former. This work therefore proposes to model semantic user attention by capturing the worker’s understanding of task content using task-semantics specific behavior features. We develop a web-based annotation interface for gathering user behavior data when workers perform a knowledge path retrieval task. With the collected data, we train several prediction models using behavior features corresponding to different aspects of user attention and conduct experiments on a set of annotation tasks performed by 51 Amazon Mechanical Turk workers. We show that the prediction model using both general and semantic user attention features achieves the best performance of nearly 75% accuracy.
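The abstract describes training prediction models on browsing-behavior features that capture general and semantic user attention. The following is a minimal sketch of that idea, not the authors' actual pipeline: the feature names, the synthetic data, and the random-forest classifier are illustrative assumptions only.

```python
# Sketch: predicting per-task annotation quality from browsing-behavior features.
# All feature names and data below are hypothetical placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_tasks = 500

# Hypothetical "general attention" features: overall engagement signals
# (e.g. dwell time, cursor travel distance, scroll depth).
general = rng.random((n_tasks, 3))
# Hypothetical "semantic attention" features: engagement with task-relevant
# content (e.g. hover time on key entities, revisits to the query text).
semantic = rng.random((n_tasks, 2))

X = np.hstack([general, semantic])
# Synthetic quality label (1 = acceptable annotation), loosely tied to attention.
y = (0.6 * semantic[:, 0] + 0.4 * general[:, 0]
     + 0.1 * rng.standard_normal(n_tasks) > 0.5).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```

Combining both feature groups in `X`, as opposed to using only the general-attention columns, mirrors the paper's finding that the model using both general and semantic attention features performs best.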
Keywords
Crowdsourcing, Machine Learning, Annotations, User Modeling, Empirical Study
Discipline
Databases and Information Systems | Numerical Analysis and Scientific Computing
Research Areas
Data Science and Engineering
Publication
IEEE Access
First Page
1
Last Page
16
ISSN
2169-3536
Identifier
10.1109/ACCESS.2022.3212080
Publisher
Institute of Electrical and Electronics Engineers
Citation
LO, Pei-chi and LIM, Ee-peng.
Your cursor reveals: On analyzing workers’ browsing behavior and annotation quality in crowdsourcing tasks. (2023). IEEE Access. 1-16.
Available at: https://ink.library.smu.edu.sg/sis_research/7950
Copyright Owner and License
Authors
Creative Commons License
This work is licensed under a Creative Commons Attribution 3.0 License.
Additional URL
https://doi.org/10.1109/ACCESS.2022.3212080
Included in
Databases and Information Systems Commons, Numerical Analysis and Scientific Computing Commons