An independent oversight board created by Meta is calling on the Facebook parent, led by Mark Zuckerberg, to overhaul its special handling of content posted by VIPs. AFP

An oversight panel said Tuesday that Facebook and Instagram put business over human rights when giving special treatment to rule-breaking posts by politicians, celebrities and other high-profile users.

A year-long probe by an independent "top court" created by the tech firm ended with a call to overhaul a system known as "cross-check," which shields elite users from Facebook's content rules.

"While Meta told the board that cross-check aims to advance Meta's human rights commitments, we found that the program appears more directly structured to satisfy business concerns," the panel said in a report.

"By providing extra protection to certain users selected largely according to business interests, cross-check allows content which would otherwise be removed quickly to remain up for a longer period, potentially causing harm."

Cross-check is implemented in a way that does not meet Meta's human rights responsibilities, according to the board.

Meta told the board the program is intended to provide an additional layer of human review to posts by high-profile users that initially appear to break rules, the report indicated.

That has resulted in posts that would have been immediately removed being left up during a review process that could take days or months, according to the report.

"This means that, because of cross-check, content identified as breaking Meta's rules is left up on Facebook and Instagram when it is most viral and could cause harm," the board said.

Meta also failed to determine whether the process had resulted in more accurate decisions regarding content removal, the board said.

Cross-check is flawed in "key areas," including user equality and transparency, the board concluded, recommending 32 changes to the system.

Content identified as violating Meta's rules with "high severity" in a first assessment "should be removed or hidden while further review is taking place," the board said.

"Such content should not be allowed to remain on the platform accruing views simply because the person who posted it is a business partner or celebrity."

The Oversight Board said it learned of cross-check in 2021, while looking into and eventually endorsing Facebook's decision to suspend former US president Donald Trump.

In a statement Tuesday, Meta President of Global Affairs Nick Clegg said the firm has agreed to review the board's recommendations and respond within 90 days.

He said that in the past year, Facebook has made improvements to the process, including widening eligibility for cross-check reviews and adding more controls on how users are added to the system.

"We built the cross-check system to prevent potential over-enforcement... and to double-check cases where there could be a higher risk for a mistake or when the potential impact of a mistake is especially severe," such as journalistic reporting from conflict zones, Clegg said.