It’s too easy to get around Facebook’s content policies

22 March 2024

New research shows that even after Facebook made changes to stem the tide of dangerous pandemic misinformation, some accounts continued to thrive, write Amelia Johns, Emily Booth, Francesco Bailo and Marian-Andrei Rizoiu.

Detail of the Facebook app’s false news report feature on a smartphone. Picture: Adobe Stock

During the COVID pandemic, social media platforms were swarmed by far-right and anti-vaccination communities that spread dangerous conspiracy theories.

These included the false claims that vaccines are a form of population control, and that the virus was a “deep state” plot. Governments and the World Health Organization redirected precious resources from vaccination campaigns to debunk these falsehoods.

As the tide of misinformation grew, platforms were accused of not doing enough to stop the spread. To address these concerns, Meta, the parent company of Facebook, made several policy announcements in 2020–21. However, it hesitated to remove “borderline” content, or content that didn’t cause direct physical harm, save for one policy change in February 2021 that expanded the content removal lists.

To stem the tide, Meta relied more heavily on algorithmic moderation techniques that reduce the visibility of misinformation in users’ feeds, search and recommendations – a practice known as shadowbanning. It also used fact-checkers to label misinformation.

While shadowbanning is widely seen as a concerningly opaque technique, our new research, published in the journal Media International Australia, instead asks: was it effective?

What did we investigate?

We used two measures to answer this question. First, after identifying 18 Australian far-right and anti-vaccination accounts that consistently shared misinformation between January 2019 and July 2021, we analysed the performance of these accounts using key metrics.

Second, we mapped this performance against five content moderation policy announcements for Meta’s flagship platform, Facebook.

The findings revealed two divergent trends. After March 2020, the overall performance of the accounts – that is, their median performance – declined. And yet their mean performance rose after October 2020.

This is because, while the majority of the monitored accounts underperformed, a few accounts overperformed instead, and strongly so. In fact, they continued to overperform and attract new followers even after the alleged policy change in February 2021.
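
As a rough illustration of how a handful of strong overperformers can pull the mean upwards even while the median falls, consider the sketch below. The engagement figures are invented for the example and are not drawn from the study’s data.

    import statistics

    # Hypothetical engagement counts for 18 monitored accounts, before and
    # after the moderation changes. Invented for illustration only.
    before = [120, 95, 110, 80, 105, 90, 100, 85, 115,
              98, 102, 88, 93, 97, 108, 84, 91, 99]
    after = [60, 45, 55, 40, 50, 48, 52, 42, 58,
             47, 49, 44, 46, 51, 56, 43, 3500, 5200]  # two accounts overperform

    for label, sample in (("before", before), ("after", after)):
        print(f"{label}: median={statistics.median(sample)}, "
              f"mean={statistics.mean(sample):.1f}")
    # The median falls (the typical account does worse) while the mean rises,
    # because the two outliers dominate the average.

Because the mean is sensitive to outliers and the median is not, the two measures can move in opposite directions – which is exactly the divergence we observed.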

Shadowbanning as a badge of pride

To examine why, we scraped and thematically analysed comments and user reactions from posts on these accounts. We found users had a high motivation to stay engaged with problematic content. Labelling and shadowbanning were viewed as motivating challenges.

Specifically, users frequently employed “social steganography” – deliberate typos or code words for key terms – to evade algorithmic detection. We also saw conspiracy “seeding”, where users added links to archiving sites or less moderated sites in comments to redistribute content Facebook had labelled as misinformation, and to avoid detection.

In one example, a user added a link to a BitChute video with keywords that dog-whistled support for QAnon-style conspiracies. As terms such as “vaccine” were believed to trigger algorithmic detection, emoji or other code names were used in their place:

A friend sent me this link, it’s [sic.] refers to over 4000 deaths of individuals after getting 💉 The true number will not come out, it’s not in the public’s interest to disclose the amount of people that have died within day’s [sic.] of jab.
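
To see why such substitutions frustrate keyword-based screening, consider the toy filter below – the kind of matching this obfuscation is designed to defeat. It is purely illustrative: it is not Facebook’s moderation system, and the flagged terms are assumptions for the example.

    import re

    # Toy keyword filter: illustrative only, not Facebook's moderation system.
    # The flagged terms are assumptions for this example.
    FLAGGED_TERMS = {"vaccine", "vaccination", "covid"}

    def naive_flag(comment: str) -> bool:
        """Flag a comment if it contains any flagged term as a plain word."""
        words = set(re.findall(r"[a-z]+", comment.lower()))
        return bool(words & FLAGGED_TERMS)

    print(naive_flag("The vaccine caused over 4000 deaths"))  # True
    print(naive_flag("over 4000 deaths after getting 💉"))     # False: emoji stand-in
    print(naive_flag("died within days of the jab"))          # False: code word
    print(naive_flag("the v@ccine agenda"))                   # False: deliberate typo

Each of the tactics quoted above – emoji substitution, code words such as “jab”, deliberate typos – slips past a filter that only matches literal terms.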

While many conspiracy theories were targeted at government and public health authorities, platform suppression of content fuelled further conspiracies regarding big tech and their complicity with “Big Pharma” and governments.

This was evident in the use of keywords such as MSM (“mainstream media”) to reference QAnon-style agendas:

MSM are in on this whole thing, only report on what the elites tell them to. Clearly you are not doing any research but listening to msm […] This is a completely experimental ‘vaccine’.

Another comment thread showed reactions to Meta’s dangerous organisations policy update, under which accounts that regularly shared QAnon content were labelled “extremist”. In the reactions, MSM and “the agenda” appeared frequently.

Some users recommended that sensitive content be moved to alternative platforms. We observed one anti-vaccination influencer complaining that their page was being shadowbanned by Facebook and calling on their followers to recommend a “good, censorship free, livestreaming platform”.

The replies suggested moderation-lite sites such as Rumble. Similar recommendations were made for Twitch, a livestreaming site popular with gamers, which has since attracted far-right political influencers.

As one user said:

I know so many people who get censored on so many apps especially Facebook and Twitch seems to work for them.

How can content moderation fix the problem?

These tactics of coordination to detect shadowbans, resist labelling and fight the algorithm provide some insight into why engagement didn’t dim on some of these “overperforming” accounts despite all the policies Meta put in place.

This shows that Meta’s suppression techniques, while partially effective in containing the spread, do nothing to prevent those invested in sharing (and finding) misinformation from doing so.

Firmer policies on content removal and user banning would help address the problem. However, Meta’s announcement last year suggests the company has little appetite for this. Any loosening of these policies will all but ensure this misinformation playground continues to thrive.

Amelia Johns, Associate Professor, Digital and Social Media, School of Communication, University of Technology Sydney; Emily Booth, Research assistant, University of Technology Sydney; Francesco Bailo, Lecturer, Digital and Social Media, University of Sydney, and Marian-Andrei Rizoiu, Associate Professor in Behavioral Data Science, University of Technology Sydney

This article is republished from The Conversation under a Creative Commons license. Read the original article.
