Oversight board slams Facebook, Instagram for special treatment of high-profile users
WASHINGTON: An oversight panel said on Tuesday that Facebook and Instagram put business over human rights when giving special treatment to rule-breaking posts by politicians, celebrities and other high-profile users.
A year-long probe by an independent “top court” created by the tech firm ended with it calling for the overhaul of a system known as “cross-check” that shields elite users from Facebook’s content rules.
“While Meta told the board that cross-check aims to advance Meta’s human rights commitments, we found that the program appears more directly structured to satisfy business concerns,” the panel said in a report.
“By providing extra protection to certain users selected largely according to business interests, cross-check allows content which would otherwise be removed quickly to remain up for a longer period, potentially causing harm.”
Cross-check is implemented in a way that does not meet Meta’s human rights responsibilities, according to the board.
Meta told the board the program is intended to provide an additional layer of human review for posts by high-profile users that initially appear to break its content rules, the report indicated.
That has resulted in posts that would have been immediately removed being left up during a review process that could take days or months, according to the report.
“This means that, because of cross-check, content identified as breaking Meta’s rules is left up on Facebook and Instagram when it is most viral and could cause harm,” the board said.
Meta also failed to determine whether the process had resulted in more accurate decisions regarding content removal, the board said.
Cross-check is flawed in “key areas” including user equality and transparency, the board concluded, issuing 32 recommendations for changing the system.
Content identified as violating Meta’s rules with “high severity” in a first assessment “should be removed or hidden while further review is taking place,” the board said.
“Such content should not be allowed to remain on the platform accruing views simply because the person who posted it is a business partner or celebrity.”
The Oversight Board said it learned of cross-check in 2021, while looking into and eventually endorsing Facebook’s decision to suspend former US president Donald Trump.