CrowdEval: A cost-efficient strategy to evaluate crowdsourced worker's reliability

Chenxi Qiu, Anna Squicciarini, Dev Rishi Khare, Barbara Carminati, James Caverlee

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

13 Scopus citations

Abstract

Crowdsourcing platforms depend on the quality of work provided by a distributed workforce. Yet, it is challenging to dependably measure the reliability of these workers, particularly in the face of strategic or malicious behavior. In this paper, we present a dynamic and efficient solution for tracking workers' reliability. In particular, we use both gold-standard evaluation and peer-consistency evaluation to measure each worker's performance, and adjust the proportion of the two types of evaluation according to the estimated distribution of workers' behavior (e.g., reliable or malicious). Through experiments on real Amazon Mechanical Turk traces, we find that our approach yields significant gains in accuracy and cost compared to state-of-the-art algorithms.
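At a high level, the adaptive mix of the two evaluation types described in the abstract can be sketched as follows. This is a hypothetical simplification, not the paper's actual algorithm: the function names, the linear schedule for the gold-standard fraction, and the weighted combination of scores are all assumptions made for illustration.

```python
def gold_fraction(p_malicious, f_min=0.1, f_max=0.9):
    """Choose what share of checks use (costly) gold-standard tasks.

    Hypothetical linear schedule: the more workers are estimated to be
    malicious, the more we lean on gold-standard evaluation.
    """
    return f_min + (f_max - f_min) * p_malicious


def reliability(answers, gold, peer_majority, frac):
    """Score a worker by mixing gold-standard and peer-consistency checks.

    answers:       {task_id: worker's answer}
    gold:          {task_id: known correct answer} for gold-standard tasks
    peer_majority: {task_id: majority answer of other workers}
    frac:          weight given to the gold-standard score
    """
    gold_tasks = [t for t in answers if t in gold]
    peer_tasks = [t for t in answers if t not in gold]

    # Accuracy on tasks with a known ground-truth answer.
    gold_acc = (sum(answers[t] == gold[t] for t in gold_tasks)
                / len(gold_tasks)) if gold_tasks else 0.0
    # Agreement with the peer majority on the remaining tasks.
    peer_acc = (sum(answers[t] == peer_majority.get(t) for t in peer_tasks)
                / len(peer_tasks)) if peer_tasks else 0.0

    return frac * gold_acc + (1 - frac) * peer_acc


# Example: one gold task (answered correctly), two peer-checked tasks
# (agreeing with the majority on one of them).
frac = gold_fraction(0.5)  # 0.5 under the assumed schedule
score = reliability({'a': 1, 'b': 0, 'c': 1},
                    {'a': 1},
                    {'b': 0, 'c': 0},
                    frac)
```

The design intuition follows the abstract: peer consistency is cheap but gameable by coordinated malicious workers, while gold-standard tasks are trustworthy but cost money to create, so the mixing proportion is adapted to the estimated behavior distribution.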

Original language: English (US)
Title of host publication: 17th International Conference on Autonomous Agents and Multiagent Systems, AAMAS 2018
Publisher: International Foundation for Autonomous Agents and Multiagent Systems (IFAAMAS)
Pages: 1486-1494
Number of pages: 9
ISBN (Print): 9781510868083
State: Published - 2018
Event: 17th International Conference on Autonomous Agents and Multiagent Systems, AAMAS 2018 - Stockholm, Sweden
Duration: Jul 10 2018 - Jul 15 2018

Publication series

Name: Proceedings of the International Joint Conference on Autonomous Agents and Multiagent Systems, AAMAS
Volume: 2
ISSN (Print): 1548-8403
ISSN (Electronic): 1558-2914

All Science Journal Classification (ASJC) codes

  • Artificial Intelligence
  • Software
  • Control and Systems Engineering
