Here’s How Anti-Vaxxers Are Spreading Misinformation Despite Your Best Moderation Efforts
What moderation tactics have you used or seen as a mechanism to curtail the spread of misinformation in communities and on social media platforms? Word detection, link blocking, and digital stickers promoting legitimate information sources may immediately come to mind.
But what would happen if you ran your moderation tools against URLs shared in link-in-bio services used in your community? Or what if you learned that folks on your platform were using specific codewords to circumvent word detection? Or posting screenshots of misinformation rather than using plain-text? People are getting creative with how they share all types of information online, misinformation included. Are our moderation strategies keeping up?
In this discussion, Patrick chats with Joseph Schafer, an undergraduate student of Computer Science and Ethics at the University of Washington, and Rachel Moran, a postdoctoral fellow at the University of Washington’s Center for an Informed Public. They discuss their research and how anti-vaccine advocates are circumventing content moderation efforts on Facebook, Instagram, Twitter, and other large social networks. Some of their findings might surprise you! For example, specific folk theories have emerged that define how some believe social platforms and algorithms work to moderate their content and conversations. And whether these theories are true or not, the strategies forming around them do seem to help people keep questionable content up long enough for researchers to come across it.
So, where do we start? How can we detect misinformation if people are using codewords like pizza or Moana to get around our tools and teams? There may not be precise solutions here just yet, but Rachel and Joseph both offer ideas to help us down the right path, which starts with deciding that the engagement that brews around misinformation is not safe for the long-term health of your community.
Among our topics:
- Why Linktree needs community guidelines and how link-in-bio sites have become a vector for misinformation
- The folk theories that are informing how we perceive and operate around social media algorithms
- Adapting your moderation strategies to better find misinformation
If you enjoy our show, please know that it’s only possible with the generous support of our sponsor: Vanilla, a one-stop shop for online community.

Big Quotes
Using lexical variation to circumvent moderation filters (2:45): “They found this big group of people who were using ‘dancing’ or other kinds of verbs to mean getting the vaccine. Complete replacement of the word [vaccine]. You wouldn’t know that that meant vaccination unless you were a member of that community and had the institutional knowledge that comes with being a member. We see [lexical variation] on a spectrum.” –@rachelemoran
Emojis, code words, and symbols can form the insider language of a community (3:08): “We see ‘v@ccine’ where the A is an @ sign or people using the vaccine emoji rather than using the word at all. They believe that if they put that instead of spelling out vaccine, … they’ll avoid being caught up in the algorithmic moderation that happens on platforms.” –@rachelemoran
Misinformation finds a hiding place in link-in-bios (5:05): “There’s a variety of ways that you can … get around [link blocks]. One might be, for example, using a screenshot of an article or something that is vaccine misinformation, rather than putting in the text of the misinformation directly. … There’s also various websites like URL shorteners or URL compilers, or even just a Word document … that is filled with links to sites that maybe these major platforms are moderating and blocking.” –@joey__schafer
Using vaccination promotion tools to promote anti-vaccine content (10:56): “[On Instagram stories, you can use] that little sticker that says, ‘Let’s get vaccinated.’ Then Instagram collates those of your friends that have [used that] sticker … and it goes at the top of your [stories section]. … [We’re seeing people] put a sticker over the top of that sticker or they are like, ‘Let’s not get vaccinated.'” –@rachelemoran
The engagement surrounding misinformation isn’t good for the long-term health of your community or your business (32:06): “Part of the problem with misinformation is that it’s really engaging. When you’re making money off of engagement, there’s only so far you’re going to go to take down misinformation without going too far into your bottom line. … I feel like there is a tide-turning moment happening where the bigger platforms are realizing that misinformation is a vulnerability that degrades the product that can have economic disadvantages.” –@rachelemoran

About Joseph Schafer and Rachel Moran
Joseph Schafer is a fourth-year undergraduate student at the University of Washington, studying Computer Science and Ethics. He has also worked as a research assistant for the university’s Center for an Informed Public since January 2020, studying various forms of online misinformation and disinformation. Joseph hopes to pursue graduate school in information science in order to understand how misinformation takes advantage of recently developed socio-technical systems, like social media, to influence our society.
Rachel Moran is a postdoctoral fellow at the University of Washington’s Center for an Informed Public. Moran received her doctoral degree from the Annenberg School for Communication and Journalism at the University of Southern California. Her research explores the role of trust in digital information environments and is particularly concerned with how trust is implicated in the spread of mis- and disinformation. Her research has been published in academic journals and covered by the New York Times, Vox, Vice, and others. She was also an affiliate fellow at George Washington University’s Institute for Data, Democracy, and Politics and UNC Chapel Hill’s Center for Information, Technology, and Public Life.

Related Links
- Sponsor: Vanilla, a one-stop-shop for online community
- Joseph Schafer on Twitter
- Joseph Schafer’s website
- Rachel Moran on Twitter
- University of Washington’s Center for an Informed Public
- Content moderation avoidance strategies, via The Virality Project
- Anti-vaccine groups changing into ‘dance parties’ on Facebook to avoid detection, via NBC News
- Linktree’s community guidelines
- First I “like” it, then I hide it: Folk Theories of Social Feeds
- Dr. Jennifer Beckett on Community Signal
- A top spreader of coronavirus misinformation says he will delete his posts after 48 hours, via the New York Times
- Election Integrity Partnership, which Joseph and Rachel both worked on
- Jay Rosen on Community Signal
If you have any thoughts on this episode that you’d like to share, please leave me a comment, send me an email, or a tweet. If you enjoy the show, we would be so grateful if you spread the word and supported Community Signal on Patreon.