TikTok is the latest social media company to go after the QAnon conspiracy theory.
The mega-popular video app is now deleting accounts that post QAnon-related content, NPR reports, an escalation from its previous policy of reducing the visibility of the posts.
“Content and accounts that promote QAnon violate our disinformation policy and we remove them from our platform,” the company told NPR. “We’ve also taken significant steps to make this content harder to find across search and hashtags by redirecting associated terms to our Community Guidelines.”
QAnon content often centers on baseless claims that prominent Democrats are part of a Satanic pedophile conspiracy.
TikTok’s move comes less than a week after Google-owned YouTube announced that it would ban QAnon and Pizzagate videos that target individuals or groups “to justify real-world violence.”
It also arrives on the heels of Facebook’s decision to remove hundreds of QAnon-affiliated groups over concerns that they could threaten public safety.
The social-media giant said earlier this month that it took down more than 790 groups, 100 pages and 1,500 ads linked to the far-right movement.
The Chinese-owned TikTok is currently in a battle to save its app from being banned in the US by the Trump administration, which has accused it of spying on Americans for the Chinese government.
Parent company ByteDance is still working to finalize a deal to spin off the app’s American operations. The proposal would set up a new US-based company called TikTok Global that would be partially owned by American investors, including software firm Oracle and retail chain Walmart, which would hold a combined 20 percent stake.
President Trump imposed a Nov. 12 deadline for the sale, after which TikTok’s US operations would essentially be forced to halt.