The Journal of Communication has just published a new article led by Johannes Kaiser (a brilliant colleague who was a postdoc at the Online Civic Culture Centre at Loughborough University, with funding from the Swiss National Science Foundation) and co-authored with Andrew Chadwick and me. The article is published open access, which means everyone can freely read and download it.
In this study, based on two experiments with representative samples of social media users in Germany, we investigate how people use the “block” and “unfollow” (or “unfriend”) functions on social media against friends who share misinformation. In particular, we ask whether users apply these functions differently depending on whether the friend who shares misinformation holds political views similar to or different from their own.
We find that, indeed, users are substantially and significantly more likely to block friends who share misinformation if those friends hold political views different from their own. We argue that these patterns, in turn, have important implications for network polarization, i.e., the degree to which the users with whom we are in contact on social media tend to cluster based on political preferences. By disproportionately purging disagreeing friends who share misinformation from their social media feeds, users may end up reducing the diversity of the content they encounter, whether accurate or not. As we argue in the conclusion:
“Partisan blocking derives from a confluence of other users’ norm violations and popular social media affordances originally introduced to grant people greater control over their online experiences. Even when used by citizens to protect themselves from misinformation shared by their online friends, blocking and unfriending can end up disproportionately severing ties to politically dissimilar others. At the same time, because politically similar friends who share inaccurate content are less likely to be blocked, partisan blocking does little to solve the problem of users who continue to push misinformation to their like-minded online friends.”
I also had the pleasure of speaking about our research with Shraddha Chakradhar from Nieman Lab, who wrote an excellent article illustrating the results of our research. The final quote from the article nicely summarizes one of the key implications of our work:
“I think that probably the most important takeaway is that there are some drawbacks to the widespread assumption that one of the best ways to protect people against disinformation is to give users tools that enable them to limit contact with other people who share misinformation,” Vaccari told me. “If people applied those tools in a politically neutral way, then there would be no problem with that argument. But the problem, as this study shows, is that people apply those blocking and unfollowing tools in a way that is partisan.”
Full citation: Johannes Kaiser, Cristian Vaccari, Andrew Chadwick, Partisan Blocking: Biased Responses to Shared Misinformation Contribute to Network Polarization on Social Media, Journal of Communication, Volume 72, Issue 2, April 2022, Pages 214–240, https://doi.org/10.1093/joc/jqac002