Abstract
Transparency matters to people who experience moderation on online platforms, and much CSCW research has treated explanations as a primary means of enhancing moderation transparency. However, relatively little attention has been paid to unpacking what transparency entails in moderation design, especially for content creators. We interviewed 28 YouTubers to understand their moderation experiences and analyze the dimensions of moderation transparency. We identified four primary dimensions: participants desired the moderation system to present moderation decisions saliently, explain those decisions thoroughly, afford effective communication with users, and offer opportunities for repair and learning. We discuss how these four dimensions are mutually constitutive and conditioned in the context of creator moderation, where governance mechanisms target not only content but also creators' careers. We then elaborate on how a dynamic transparency perspective could value content creators' digital labor, how transparency design could support creators' learning, and what implications follow for transparency design on other creator platforms.
Original language | English (US)
---|---
Article number | 44
Journal | Proceedings of the ACM on Human-Computer Interaction
Volume | 7
Issue number | CSCW1
DOIs | 
State | Published - Apr 16 2023
All Science Journal Classification (ASJC) codes
- Social Sciences (miscellaneous)
- Human-Computer Interaction
- Computer Networks and Communications