Publication Type
Conference Proceeding Article
Version
acceptedVersion
Publication Date
5-2022
Abstract
Online participant recruitment platforms such as Prolific have been gaining popularity in research, as they enable researchers to easily access large pools of participants. However, participant quality can be an issue; participants may give incorrect information to gain access to more studies, adding unwanted noise to results. This paper details our experience recruiting participants from Prolific for a user study requiring programming skills in Node.js, with the aim of helping other researchers conduct similar studies. We explore a method of recruiting programmer participants using prescreening validation, attention checks and a series of programming knowledge questions. We received 680 responses, and determined that 55 met the criteria to be invited to our user study. We ultimately conducted user study sessions via video calls with 10 participants. We conclude this paper with a series of recommendations for researchers.
Discipline
Artificial Intelligence and Robotics
Research Areas
Intelligent Systems and Optimization
Areas of Excellence
Digital transformation
Publication
Proceedings of the RoPES ’22: 1st International Workshop on Recruiting Participants for Empirical Software Engineering, Virtual Conference, May 17
First Page
1
Last Page
3
Identifier
10.48550/arXiv.2201.05348
City or Country
Pittsburgh, PA, USA
Citation
REID, Brittany; WAGNER, Markus; D’AMORIM, Marcelo; and TREUDE, Christoph.
Software engineering user study recruitment on Prolific: An experience report. (2022). Proceedings of the RoPES ’22: 1st International Workshop on Recruiting Participants for Empirical Software Engineering, Virtual Conference, May 17. 1-3.
Available at: https://ink.library.smu.edu.sg/sis_research/10466
Creative Commons License

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Additional URL
https://arxiv.org/abs/2201.05348