ABSTRACT

Online venues of public or semi-public communication such as social media, news websites’ comments sections or dating apps depend on volunteer contributions of user-generated content – images, videos or texts – that are provided as a form of free online labor. Citizens use platforms for a variety of ends – to participate politically, to assemble around specific topics such as self-help communities, or to engage in citizen journalism. Yet user-generated content is inherently subject to the rules of the platforms that host it. Modes of governance may include top-down approaches in which platform moderators, often working under precarious labor conditions, regulate and moderate user-generated content. By contrast, peer production sites such as Wikipedia rest upon principles of self-organization, in which citizens create and maintain regimes of content creation and moderation, and volunteers provide free labor for the common good of a particular community. This ‘civic labor’ serves three groups of stakeholders: platform participants, other moderators, and the platform itself. As platforms grow, however, governance in democratic peer production settings may gravitate towards oligarchy. Platform moderation policies and censorship mechanisms on both commercial and volunteer platforms can clash with user interests. Therefore, due process – the protection of the rights of contributors in cases where speech rights are curtailed by social media takedowns – is attracting more attention. This entry provides an overview of research and debates on content moderation and its social relevance at the intersection of platform power and users – between liability mitigation and social media platform cultures, and between high degrees of content moderation and little interference.
It then focuses on Reddit as an exemplary case of how content is moderated and managed – through both voluntary labor and professional content moderators – and explores the social and political ramifications of such mechanisms of platform governance.