We have published a series of research articles examining user studies, along with practical suggestions for implementation.
Some of our findings and recommendations are:
- Fair Pay: Adequate compensation for crowdworkers is task dependent; flat rates are often problematic
- Real Users: Students are overused in academic studies; there are workable alternatives for recruiting real users
- Data Validity: Numbers about people are often assumed to be valid when they are not
- Enriching Surveys: Combining surveys with the think-aloud method provides rich qualitative insights
- Participant Selection: Crowdworker platforms vary in quality, and all need careful piloting of recruitment and quality control
Read more in these articles!
Salminen, J., Sayed Kamel, A.M., Jung, S.G., Mustak, M., and Jansen, B. J. (2022) Fair compensation of crowdsourcing work: the problem of flat rates. Behaviour & Information Technology, 27(1), 3-35. DOI: 10.1080/0144929X.2022.2150564
Salminen, J., Jung, S.G., Kamel, A., Froneman, W., and Jansen, B. J. (2022) Who is in the sample? An analysis of real and surrogate users as participants in user study research in the information technology fields. PeerJ Computer Science, 8:e1136. https://doi.org/10.7717/peerj-cs.1136
Jansen, B. J., Salminen, J., Jung, S.G., and Almerekhi, H. (2022) The Illusion of Data Validity: Why Numbers About People Are Likely Wrong. Data and Information Management, 6(4), 100020.
Nielsen, L., Salminen, J., Jung, S.G., and Jansen, B. J. (2021) Think-Aloud Surveys – A Method for Eliciting Enhanced Insights During User Studies. INTERACT 2021: 18th IFIP TC13 International Conference on Human–Computer Interaction, 30 Aug. – 3 Sept., Bari, Italy.
Salminen, J., Jung, S.G., and Jansen, B.J. (2021) Suggestions for Online User Studies. In: Stephanidis, C. et al. (eds) HCI International 2021 – Late Breaking Papers: Design and User Experience. HCII 2021. Lecture Notes in Computer Science, vol. 13094, pp. 127-146. https://doi.org/10.1007/978-3-030-90238-4_11