Facebook may have a fake news problem, but it's not interested in allowing third parties to solve it. A plugin designed to spot false information shared on Facebook was reportedly blocked by the social network on Friday.

The B.S. Detector, a Google Chrome and Firefox browser plugin designed by activist and independent journalist Daniel Sieradski, was made available after a deluge of fake news reports bombarded Facebook news feeds during the 2016 election. The plugin simply placed a red warning bar across stories from untrustworthy sources.

Sieradski described the plugin as "a rejoinder to Mark Zuckerberg’s dubious claims that Facebook is unable to substantively address the proliferation of fake news on its platform."

But instead of allowing the plugin to keep fake news from spreading, Facebook apparently decided to crack down on the plugin itself. According to tweets from Sieradski and experiences reported by other users, Facebook prevented the link to B.S. Detector—which it labeled as "malicious"—from being shared on its platform.

In an interview with Motherboard, Sieradski said his plugin had been installed about 25,000 times. Users who had B.S. Detector installed could still use it, and the website could still be visited, but sharing the link to the service via Facebook had been blocked.

Sieradski later reported on Twitter that links to B.S. Detector were once again working when shared on Facebook, but the social network did not immediately explain why it spent the better part of Friday keeping its users from linking to the service.

"We maintain a set of systems to help us detect and block suspicious behavior on our site," a spokesperson for Facebook told IBTimes. "We temporarily blocked people from sharing the domain bsdetector.tech because of other abuse we have seen from the .tech top-level domain. We have corrected the error.”

Facebook has come under intense scrutiny since the 2016 election, during which the site became a breeding ground for false information. According to a report from BuzzFeed, fake stories received more engagement in the final weeks leading up to election day than real stories from trusted publications.

Many of the stories shared on the social media service came from sites created specifically to spread false information. Motivation ranged from trolling to profit—especially in the case of a group of Macedonian teens who turned making fake news for Trump supporters into a business model. Experts have also suggested fake news was manufactured by Russians as part of an extended propaganda campaign.

Despite these issues being most prevalent on Facebook, the company's CEO Mark Zuckerberg has insisted the site had little influence on the outcome of the election and claimed 99 percent of content seen on the platform is authentic. The company has nonetheless promised to develop tools to help curb the spread of bad information.