Publication Type

Conference Proceeding Article

Version

acceptedVersion

Publication Date

7-2022

Abstract

Crowdsourcing is an effective means of accomplishing human intelligence tasks by leveraging the collective wisdom of crowds. Given reports of varying accuracy from workers, it is important to use these reports wisely to derive accurate task results. Intuitively, a task result derived from a sufficient number of reports bears lower uncertainty, while one derived from fewer reports bears higher uncertainty. Existing report aggregation research, however, has largely neglected this uncertainty issue. We therefore propose a novel report aggregation framework that defines and incorporates a new confidence measure to quantify the uncertainty associated with tasks and workers, thereby enhancing result accuracy. In particular, we employ a link analysis approach to propagate confidence information, subgraph extraction techniques to prioritize workers, and a progressive approach to gradually explore and consolidate reports associated with less confident workers and tasks. The framework is generic enough to be combined with existing report aggregation methods. Experiments on four real-world datasets show that it improves the accuracy of several competitive state-of-the-art methods.
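As a rough illustration of the kind of link-analysis-style confidence propagation the abstract alludes to, the sketch below alternates between estimating task labels from confidence-weighted votes and re-estimating worker confidence from agreement with those estimates. This is a generic iterative weighted-voting scheme, not the authors' actual algorithm; the function name, the binary-label setting, and the specific update rules are illustrative assumptions.

```python
from collections import defaultdict

def propagate_confidence(reports, num_iters=20):
    """Iteratively estimate task labels and worker confidences.

    reports: list of (worker_id, task_id, label) with labels in {0, 1}.
    Returns (task_labels, worker_confidence).
    """
    by_task = defaultdict(list)
    by_worker = defaultdict(list)
    for w, t, y in reports:
        by_task[t].append((w, y))
        by_worker[w].append((t, y))

    conf = {w: 1.0 for w in by_worker}  # start with uniform confidence
    labels = {}
    for _ in range(num_iters):
        # Task step: confidence-weighted vote per task.
        for t, votes in by_task.items():
            score = sum(conf[w] * (1 if y == 1 else -1) for w, y in votes)
            labels[t] = 1 if score >= 0 else 0
        # Worker step: confidence = fraction of agreement with estimates.
        for w, votes in by_worker.items():
            agree = sum(1 for t, y in votes if labels[t] == y)
            conf[w] = agree / len(votes)
    return labels, conf

# Tiny demo: w3 disagrees with the majority on both tasks, so its
# confidence drops and its votes carry less weight in later rounds.
reports = [("w1", "t1", 1), ("w2", "t1", 1), ("w3", "t1", 0),
           ("w1", "t2", 0), ("w2", "t2", 0), ("w3", "t2", 1)]
labels, conf = propagate_confidence(reports)
```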

Keywords

crowdsourcing, report aggregation, confidence propagation, experimental evaluation

Discipline

Databases and Information Systems | Electrical and Computer Engineering

Research Areas

Data Science and Engineering

Publication

Proceedings of the 2022 IEEE International Conference on Services Computing (SCC), Barcelona, Spain, July 10-16

First Page

1

Last Page

10

Identifier

10.1109/SCC55611.2022.00051

Publisher

IEEE

City or Country

Barcelona, Spain

Additional URL

https://doi.org/10.1109/SCC55611.2022.00051
