Brunk, Jens; Mattern, Jana; Riehle, Dennis M
Research article in digital collection (conference) | Peer reviewed

User-generated online comments and posts increasingly contain abusive content that requires moderation from both an ethical and a legislative perspective. The volume of comments and the resulting need for moderation in our digital world often exceed the capacity of manual moderation. To remedy this, platforms often adopt semi-automated moderation systems. However, because such systems are typically black boxes, user trust in and acceptance of the system are not easily achieved: black-box systems can be perceived as nontransparent, and moderating user comments is easily associated with censorship. We therefore investigate the relationship between system transparency through explanations, user trust, and system acceptance in an online experiment. Our results show that the transparency of an automatic online comment moderation system is a prerequisite for user trust in the system. However, the objective transparency of the moderation system does not influence users' acceptance of the system.
Brunk, Jens | Chair of Information Systems and Information Management (IS)
Mattern, Jana | FB04 - School of Business and Economics (FB04), Interorganisational Systems Group (IOS)
Riehle, Dennis | Chair of Information Systems and Information Management (IS)
Duration: 07/02/2019 - 31/01/2022 | Funded by: MKW - EFRE-Wettbewerb Neue Leitmärkte - CreateMedia.NRW | Type of project: Individual project
Duration: 01/05/2015 - 30/04/2019 | Funded by: EC H2020 - Marie Skłodowska-Curie Actions - Research and Innovation Staff Exchange | Type of project: EU-project hosted at University of Münster