Salience bias in crowdsourcing contests

Ho Cheung Brian Lee, Sulin Ba, Xinxin Li, Jan Stallaert

Research output: Contribution to journal › Article › peer-review

60 Scopus citations

Abstract

Crowdsourcing relies on online platforms to connect a community of users to perform specific tasks. However, without appropriate control, the behavior of the online community may not align with the platform's designed objective, which can lead to inferior platform performance. This paper investigates how feedback information on a crowdsourcing platform and the systematic biases of crowdsourcing workers can affect crowdsourcing outcomes. Specifically, using archival data from the online crowdsourcing platform Kaggle, combined with survey data from actual Kaggle contest participants, we examine the role of a systematic bias, namely the salience bias, in influencing the performance of crowdsourcing workers, and how the number of crowdsourcing workers moderates the impact of the salience bias on contest outcomes. Our results suggest that the salience bias influences the performance of contestants, including the winners of the contests. Furthermore, the number of participating contestants may attenuate or amplify the impact of the salience bias on contest outcomes, depending on the effort required to complete the tasks. Our results have critical implications for crowdsourcing firms and platform designers.

Original language: English (US)
Pages (from-to): 401-418
Number of pages: 18
Journal: Information Systems Research
Volume: 29
Issue number: 2
DOIs
State: Published - Jun 1 2018

All Science Journal Classification (ASJC) codes

  • Management Information Systems
  • Information Systems
  • Computer Networks and Communications
  • Information Systems and Management
  • Library and Information Sciences
