Effect of Transparency and Trust on Acceptance of Automatic Online Comment Moderation Systems

Brunk, Jens; Mattern, Jana; Riehle, Dennis M.

Research article in digital collection (conference) | Peer reviewed

Abstract

User-generated online comments and posts increasingly contain abusive content that must be moderated, for ethical as well as legislative reasons. The volume of comments and the resulting need for moderation in our digital world often exceed the capacity of manual moderation. To remedy this, platforms often adopt semi-automated moderation systems. However, because such systems are typically black boxes, user trust in and acceptance of them are not easily achieved: black-box systems can be perceived as non-transparent, and moderating user comments is easily associated with censorship. We therefore investigate the relationship between system transparency through explanations, user trust, and system acceptance in an online experiment. Our results show that the transparency of an automatic online comment moderation system is a prerequisite for user trust in the system. However, the objective transparency of the moderation system does not influence the user's acceptance.

Details about the publication

Name of the repository: IEEE Xplore
Article number: 8808038
Status: Published
Release year: 2019
Language in which the publication is written: English
Conference: 21st IEEE Conference on Business Informatics (CBI2019), Moscow, Russia
DOI: 10.1109/CBI.2019.00056
Keywords: transparency; trust; acceptance; automatic; comment-moderation; user posts

Authors from the University of Münster

Brunk, Jens
Chair of Information Systems and Information Management (IS)
Mattern, Jana
FB04 - School of Business and Economics (FB04)
Interorganisational Systems Group (IOS)
Riehle, Dennis
Chair of Information Systems and Information Management (IS)