
Facebook made exceptions for high-profile users, misled Oversight Board: Report

The report revealed that Facebook allowed users to get away with violating its own rules against sexual harassment and violence.

High-profile users on Facebook are exempt from some or all of the social media giant’s rules, according to internal company documents accessed by the Wall Street Journal. A company programme known as ‘cross-check’ or ‘Xcheck’ reportedly allows millions of VIP users to post on the platform unchecked by Facebook’s Community Standards. When questioned about its whitelisting practices, the company also reportedly misled its Oversight Board, telling it that Xcheck only affected ‘a small number of decisions’.

Xcheck was initially intended as a quality-control measure for enforcement action taken against high-profile accounts. The programme, however, ended up granting public figures immunity from enforcement, including in cases where their posts amounted to harassment or incited violence, according to a 2019 internal review of the company’s whitelisting practices accessed by WSJ.

In the past, Facebook has routinely claimed that all users on its platforms are held to the same standards. But the WSJ report shows that when it comes to enforcing its guidelines, the company maintains an explicit divide between ordinary users and VIPs. Facebook’s decision to mislead its own Oversight Board also raises questions about whether the Board can keep the company in check.

Xcheck users got away with harassment, inflammatory claims: Facebook Internal Review

Facebook’s confidential 2019 internal review of the company’s whitelisting practices recounted multiple instances where public figures were treated differently when found in violation of the company’s guidelines. In 2020 alone, Xcheck allowed posts that violated Facebook’s rules to be viewed at least 16.4 billion times, WSJ reported.

Sexual Harassment: In some instances, posts from whitelisted users that amounted to harassment were left unchecked. The review mentions the case of footballer Neymar Jr. who, in an attempt to defend himself against a rape accusation in 2019, posted a video that contained his correspondence with his accuser, including naked pictures of the accuser, on both Facebook and Instagram. Because of Xcheck, these posts, which clearly violated the accuser’s consent, were left on the platforms for an entire day before being taken down, by which time 58 million people had already seen the video.


Facebook’s policy on non-consensual intimate imagery is straightforward: such imagery is deleted, and users who post it are immediately banned. Neymar Jr., however, was protected by Xcheck. According to the internal review, the video had serious harmful consequences for the accuser, including “the video being reposted more than 6,000 times, bullying and harassment about her character.”

Incitement to Violence: “When the looting starts, the shooting starts,” Donald Trump posted on Facebook in May 2020. An automated system designed to detect whether a post violated company policy had scored Trump’s post 90 out of 100, indicating a high likelihood of a violation. Before he was banned from Facebook in June this year, Trump was an Xcheck user, and Zuckerberg personally made the call that the post remain on the platform, as he later admitted publicly.

How many users are part of Xcheck and under what criteria?

At least 5.8 million users were part of Xcheck in 2020, documents accessed by WSJ revealed. The 2019 internal review found that differential treatment under Xcheck was both widespread and ‘not publicly defensible’. 

Most Facebook employees were allowed to add people to Xcheck, and 45 teams from across the world had been involved in whitelisting practices. An internal guide to Xcheck eligibility stated that users who were “newsworthy,” “influential or popular,” or “PR risky” could be added. Users were not notified that they had been added to the whitelist.

While there were rough guidelines on who belonged in Xcheck, Facebook had no clear-cut rules or strict criteria for whitelisting. The Xcheck whitelist is “scattered throughout the company, without clear governance or ownership,” according to Facebook’s ‘Get Well Plan’ from 2020 accessed by WSJ.  

Did Facebook mislead its own Oversight Board about Xcheck?

In May this year, Facebook’s Oversight Board upheld the company’s decision to suspend Donald Trump’s account. The board made 19 recommendations for future action in its verdict, one of which was to report on error rates and consistency of determinations made through the Xcheck process, as opposed to ordinary enforcement procedures. 


A month after these recommendations, Facebook told the Oversight Board that Xcheck was used for only a small number of decisions, and declined to follow the recommendation. “It’s not feasible to track this information,” Facebook wrote in its responses.

In a written statement to WSJ, a spokesperson for the Oversight Board mentioned that the board “has expressed on multiple occasions its concern about the lack of transparency in Facebook’s content moderation processes, especially relating to the company’s inconsistent management of high-profile accounts.” 

Facebook identified issues itself, is working to phase Xcheck out: Spokesperson

When asked about Xcheck, Facebook spokesperson Andy Stone told WSJ that criticism of Facebook was fair, but Xcheck was designed “to create an additional step so we can accurately enforce policies on content that could require more understanding.” Stone also emphasised that Facebook found these issues itself, and is working to resolve them:

“A lot of this internal material is outdated information stitched together to create a narrative that glosses over the most important point: Facebook itself identified the issues with cross check and has been working to address them.” – Facebook spokesperson Andy Stone

After the WSJ report was published, Stone took to Twitter and said, “In the end, at the center of this story is Facebook’s own analysis that we need to improve the program. We know our enforcement is not perfect and there are tradeoffs between speed and accuracy.” 

Written By

Figuring out subscriptions and growth at MediaNama. Email: nishant@medianama.com


MediaNama is the premier source of information and analysis on Technology Policy in India. More about MediaNama, and contact information, here.

© 2008-2021 Mixed Bag Media Pvt. Ltd. Developed By PixelVJ
