TY - JOUR
T1 - The face of internet recruitment
T2 - Evaluating the labor markets of online crowdsourcing platforms in China
AU - Li, Xiaojun
AU - Shi, Weiyi
AU - Zhu, Boliang
N1 - Funding Information:
Earlier versions of this paper were presented at the 2016 American Political Science Association mini-conference on Chinese Politics and at the University of California, San Diego, Global Policy and Strategy Junior Brownbag Seminar. We are grateful to Jesse Driscoll, Jean Oi, Eric Plutzer, Molly Roberts, Yiqing Xu, and conference and seminar participants for helpful comments and suggestions.
Funding Information:
This publication was made possible (in part) by a grant from Carnegie Corporation of New York. The statements made and views expressed are solely the responsibility of the authors.
Publisher Copyright:
© The Author(s) 2018.
PY - 2018/1/1
Y1 - 2018/1/1
AB - Zhubajie/Witmart and other online crowdsourcing platforms have proliferated in China, and researchers have increasingly used them for subject recruitment. One critical question remains, however: how generalizable are findings based on these online samples? In this study, we benchmark the demographics of an online sample recruited from Zhubajie against nationally representative samples and replicate attitudinal questions commonly asked in national surveys. We find that online respondents differ from the general population in many respects, yet the differences become smaller when the comparison is made with internet users in the benchmark surveys. Importantly, when predicting attitudes, our online sample with post-stratification weights produces coefficients similar to those of these internet-active subsamples in most cases. Our study suggests that online crowdsourcing platforms can be a useful tool for subject recruitment, especially when researchers are interested in making inferences about Chinese netizens. We further analyze political and social desirability issues among online subjects. Finally, we discuss caveats of using crowdsourced samples in China.
UR - http://www.scopus.com/inward/record.url?scp=85048752073&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85048752073&partnerID=8YFLogxK
U2 - 10.1177/2053168018759127
DO - 10.1177/2053168018759127
M3 - Article
AN - SCOPUS:85048752073
SN - 2053-1680
VL - 5
JO - Research and Politics
JF - Research and Politics
IS - 1
ER -