Facebook has expanded its attempts to remove the controversial QAnon conspiracy theory from all of its platforms in one of the broadest content moderation exercises the company has ever embarked upon.
Facebook will now ban all content representing the group and is also taking action against people exploiting child safety concerns to indoctrinate people in the conspiracy theory, likened by many to a cult.
The move builds on a policy announced in August that led to around 1500 pages, groups and profiles linked to QAnon being deleted for discussing or promoting violence.
Facebook will now expand that policy so that all QAnon content, not just posts with the potential to promote or incite violence, is removed from the platform.
“Starting today we will remove any Facebook Pages, Groups and Instagram accounts representing QAnon, even if they contain no violent content,” Facebook said in a blog post on Wednesday morning.
“Our Dangerous Organisations Operations team is starting to enforce this updated policy today and is removing content accordingly, but this work will take time and will continue in the coming days and weeks,” the company added.
The QAnon conspiracy theory began on more underground websites and message boards like 4Chan and later 8kun, before spreading to Reddit, and then exploding into the mainstream via Facebook and Twitter.
The movement posits many, many things, but there is no real central leadership (the “Q” character at the centre is, as you might expect, anonymous), and newly indoctrinated followers are essentially encouraged to talk themselves into it by “doing their own research” (though they’re often poked and prodded in the “right” direction).
The basic (extremely oversimplified) gist of the theory is that there exists in the world a secret cabal of Satan-worshipping paedophile elites, but their time is running out as Donald Trump will soon expose and dismantle their operation.
This has led to real-world violence in some cases, but on a more everyday human level, people are losing their loved ones to the radical theory as they fall down the rabbit hole of QAnon content.
There are also several QAnon followers running in the US election next month.
The old adage that when the US sneezes Australia catches a cold appears to apply, and some QAnon adherents have been pushing our politicians to embrace the conspiracy too.
Last week Gizmodo reported Queensland MP George Christensen used a Facebook page to follow a number of QAnon accounts (though this doesn’t mean he shares their views, that they’ve influenced his political positions, or even that they were posting that sort of content when he liked them). The page has since been deleted.
Facebook said it had “specialists who study and respond to new evolutions in violating content from this movement” and it was actively looking to remove content from the platform rather than waiting for users to report it.
“We’ve been vigilant in enforcing our policy and studying its impact on the platform but we’ve seen several issues that led to today’s update,” Facebook said.
It cited forms of real-world harm beyond the incitement of violence, such as recent QAnon claims that wildfires in the US were being deliberately lit “by certain groups” (similar theories spread on the platform during Australia’s horror summer bushfires, though according to at least one former employee Facebook is generally more interested in what happens in the US).
“Additionally, QAnon messaging changes very quickly and we see networks of supporters build an audience with one message and then quickly pivot to another,” Facebook added.
One movement hijacked by QAnon is the protection of children.
Ordinarily you’d be hard pressed to find any reasonable person in opposition to protecting children from harm, which is probably the reason it’s been co-opted by QAnon through the #SaveTheChildren hashtag.
“We began directing people to credible child safety resources when they search for certain child safety hashtags last week – and we continue to work with external experts to address QAnon supporters using the issue of child safety to recruit and organise,” Facebook said.
The company added it expects to see “renewed attempts to evade our detection” and will continue studying “the impact of our efforts and be ready to update our policy and enforcement as necessary”.
Searching for the hashtag now directs users to a statement from the humanitarian organisation Save The Children.
“While people may choose to use our organisation’s name as a hashtag to make their point on different issues, we are not affiliated or associated with any of these campaigns,” the statement reads.
REAL WORLD CONSEQUENCES
The online conspiracy theory has crossed from the URL to the IRL through incidents of violence and protests, but there are other potential real-world consequences for participating in the baseless and bizarre movement.
Just ask Jason Gelinas, until recently a technology manager at investment bank Citigroup.
Mr Gelinas was placed on paid leave in September when it emerged he was allegedly behind a QAnon website and its associated mobile apps.
The website aggregated “Q drops” – posts from the anonymous “Q” character who supposedly has high-level government security clearance – making them easier to find and navigate as well as presenting them in one place.
“Our code of conduct includes specific policies that employees are required to adhere to, and when breaches are identified, the firm takes action,” the bank said in a statement.
His website has been shut down and now links to other QAnon sites.
But it appears he wasn’t necessarily penalised for spreading the conspiracy posts, just for making money off them.
Bloomberg reports that Citigroup’s code of conduct forbids employees from engaging in outside business activity that makes them money without first getting the all-clear from their managers.
A Patreon crowd-funding site reportedly brought in $US3000 ($A4200) a month, which Mr Gelinas claimed was used to meet the costs of running the website.