Go viral and go toxic: the use and abuse of social media

Date
2024
Authors
Ling, Chen
Version
OA Version
Abstract
Since the COVID-19 pandemic, social media have become even more central to our lives. At the same time, the rapid dissemination of information online poses challenges to content quality and exposes social media platforms to various risks, making content moderation a pressing problem. Current moderation systems face a constantly changing ecosystem: abusive attacks are shaped by technology and by social events, misinformation and harassment can be implicit and context-dependent, which makes detection difficult, and different online communities have different characteristics, with abuse often unfolding across multiple communities. This dissertation addresses the critical need for innovative moderation schemes that complement the current approach, which relies heavily on human moderators for precise moderation. Given the increasing prevalence of multi-modal content such as images and videos, detecting toxicity in these modalities with automated tools is challenging. In light of these challenges, a prioritization scheme based on content virality, built through a mixed-method and multi-modal approach, can reinforce an effective human-in-the-loop moderation system. In this dissertation, I first present my work on measuring toxicity across social media through two case studies, Zoombombing and drug abuse videos, to better understand these phenomena. In both cases, I observe a mixture of misinformation and of dangerous and hateful content left unmoderated. I then evaluate the effectiveness of existing social media moderation with respect to COVID-19 content and find that it is largely text-based and often contradictory, relying on humans in the loop for precise moderation. Finally, I examine the features that drive content virality in order to prioritize potentially popular content and facilitate human moderators' decision-making. Overall, this dissertation presents a novel prioritization scheme that incorporates mixed-method and multi-modal techniques, offering a significant advancement for human-in-the-loop moderation processes on social media. By prioritizing content based on its potential virality, this approach empowers content moderators to make more informed decisions, thereby enhancing the overall efficacy of moderation systems in combating online toxicity.
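To make the prioritization idea concrete, the minimal sketch below shows how a predicted virality score could order a human moderation review queue. It is not the dissertation's implementation: the predict_virality scorer, the feature names, and their weights are hypothetical placeholders for whatever mixed-method, multi-modal model is actually used, and the values are purely illustrative.

```python
from dataclasses import dataclass, field
import heapq

@dataclass(order=True)
class QueuedPost:
    # Priority is the negated score so the most likely viral post is popped first.
    priority: float
    post_id: str = field(compare=False)

def predict_virality(features: dict) -> float:
    """Hypothetical stand-in for a learned virality model.

    Here it is just a weighted sum of a few illustrative, normalized
    engagement features (all assumed to lie in [0, 1]).
    """
    weights = {"early_shares": 0.5, "author_reach": 0.3, "has_video": 0.2}
    return sum(weights.get(name, 0.0) * value for name, value in features.items())

def build_review_queue(flagged_posts: list[tuple[str, dict]]) -> list[QueuedPost]:
    """Order flagged posts so moderators review likely-viral content first."""
    queue: list[QueuedPost] = []
    for post_id, features in flagged_posts:
        heapq.heappush(queue, QueuedPost(-predict_virality(features), post_id))
    return queue

# Usage: three flagged posts with illustrative feature values.
queue = build_review_queue([
    ("post_a", {"early_shares": 0.9, "author_reach": 0.4, "has_video": 1.0}),
    ("post_b", {"early_shares": 0.1, "author_reach": 0.1, "has_video": 0.0}),
    ("post_c", {"early_shares": 0.5, "author_reach": 0.9, "has_video": 1.0}),
])
while queue:
    item = heapq.heappop(queue)
    print(item.post_id, -item.priority)
```

In this sketch the ranking only reorders the review queue; all flagged content still reaches a human moderator, consistent with the human-in-the-loop design described in the abstract.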
License
Attribution 4.0 International