  • Facebook, Twitter and YouTube have also imposed bans on QAnon content
  • Trump himself has refused to disavow the QAnon movement or its supporters
  • A QAnon adherent, Georgia Republican Marjorie Taylor Greene, is running for Congress and is expected to win

Three weeks ahead of the U.S. election, video sharing app TikTok said it would ban any accounts that broadcast content from QAnon, the far-right online group that promotes various conspiracy theories.

The measure is designed to curb the dissemination of baseless conspiracies and disinformation during the election.

TikTok previously targeted accounts that featured specific hashtags related to QAnon; now it is prohibiting any and all content deemed to have originated in the far-right movement.

"Content and accounts that promote QAnon violate our disinformation policy and we remove them from our platform," a TikTok spokesperson told NPR. "We've also taken significant steps to make this content harder to find across search and hashtags by redirecting associated terms to our community guidelines."

TikTok told Forbes magazine that its policy on QAnon had been in place “for a while.”

Other social media giants, including Facebook (FB), Twitter (TWTR) and YouTube, which is owned by Google, a subsidiary of Alphabet (GOOG), have also imposed bans on QAnon content.

QAnon, which began in October 2017, has attracted enormous interest largely due to the power and reach of social media.

Among other conspiracies, QAnon has claimed that German Chancellor Angela Merkel is the granddaughter of Adolf Hitler, and that Barack Obama, Hillary Clinton and George Soros have been planning a coup to topple President Donald Trump.

Trump himself has refused to disavow the QAnon movement or its supporters.

One prominent QAnon adherent, Georgia Republican Marjorie Taylor Greene, is running for Congress and is widely expected to win her race.

"There should be recognition of a thing that is good and significant, even if it's long overdue," Angelo Carusone, president of Media Matters for America, a liberal nonprofit watchdog group, told NPR. "TikTok is recognizing that by the nature of the QAnon movement, you can't just get rid of their communities, the content itself is the problem. We're talking about hundreds of millions of video views just for a limited segment of QAnon communities that we identified."

Media Matters reported that just 14 QAnon-related hashtags on TikTok amassed 488 million views by June of this year.

However, Hany Farid, a computer science professor at the University of California at Berkeley, said banning something like QAnon can give it even more publicity and notoriety.

"But the [QAnon] movement got big enough and dangerous enough that people were looking at the landscape and saying, 'Yeah, this is completely out of control,'" he said. "Were they slow to do it? Probably. But [social] platforms get criticized when they act too quickly. So there is a dilemma there."

TikTok has about 100 million monthly active users in the U.S.

TikTok's community guidelines specify misinformation that "causes harm to individuals, our community or the larger public" is prohibited on the site.

Carusone worries QAnon supporters and other conspiracy theorists could find ways around the ban by manipulating or disguising hashtags.

"The test of this policy will be how much it affects the creation and germination of new QAnon content on TikTok," Carusone said. "If you know your video is going to be eliminated before it has a chance to spread, you're less likely to spend time polluting the TikTok pool."

Meanwhile, TikTok faces other problems – Trump has ordered the Chinese-owned app to find American buyers by Nov. 12, citing security concerns.

Last month a federal judge temporarily blocked the White House's effort to shut down the app.

TikTok is owned by Beijing-based ByteDance, which has agreed to form a U.S. subsidiary called TikTok Global, part-owned by Oracle (ORCL) and Walmart (WMT).

It remains unclear how the merger agreement or the White House ban will be affected by the election.

Abishur Prakash, a geopolitical futurist based in Toronto, told International Business Times that by banning QAnon, TikTok is playing a new role in the U.S. election.

“The company is deciding what kind of political content users should have access to, weeks before the presidential election,” he said. “Until recently, only U.S. social media companies had this power. Now, a Chinese platform does too. Of course, because TikTok originates from China, there will always be questions as to whether certain kinds of decisions, including censoring specific topics, have geopolitical motives.”