
Independent researchers claim Facebook is trying to kill their study on hate speech in India

The study was commissioned in mid-2020 but Facebook is delaying completion, say human rights groups involved.

After a damaging leak of internal company records triggered calls for regulatory action against Meta, independent researchers now claim that Facebook is stifling a report they were commissioned to produce on its human rights impact in India, as per a report in the Wall Street Journal. Meta has rejected the claims, saying that it was being thorough with the report and not trying to meet an ‘arbitrary deadline’.

However, researchers claim that Facebook’s human rights team, which oversees the work on the report, has been raising technical objections and narrowing its scope in an attempt to stifle the report. Dr. Ritumbra Manuvie, one of the researchers on the project, told MediaNama that the company had approached them to provide factual evidence of toxic content and their views on Facebook use in India.

Facebook in India faces renewed scrutiny from the government, legislative committees, and lawmakers following recent revelations by employee-turned-whistleblower Frances Haugen.

Allegations of the researchers

According to the WSJ report, the researchers alleged that the following actions were taken through Foley Hoag, a New York-based law firm hired by Facebook to take charge of the report.

Moving of goalposts: Foley Hoag challenged the content flagged as hate speech by Stichting The London Story (TLS), one of the human rights organisations behind the report. The law firm first asked whether TLS had reported the content on Facebook, and then asked whether it had been reported within specific time frames.


“In context of hate speech or human rights assessment of a company as big as Facebook it should not matter if the toxic content was found or flagged in a specific time frame. The fact that there was toxic content on the platform, which was widely viewed, and was not removed by Facebook despite user community flagging it – needs to be acknowledged,” Ritumbra Manuvie, the co-founder of TLS, told MediaNama.

Further, the law firm asked TLS to prove that a piece of content had caused harm, which Manuvie said was a “higher bar than human rights impact assessments must typically meet and not in the spirit of assembling an independent report.”

Technical objections: Facebook raised technical objections to the report’s definition of hate speech, and thus to the content flagged or included in the report, Ratik Asokan, an Indian Civil Watch researcher involved in the report, told MediaNama. Manuvie said that TLS had relied on Facebook’s own definitions of hate speech, violence, etc., as laid out in its content moderation policy.

Not removing flagged hateful material: The researchers claimed that they found and reported large volumes of hateful content on the platform, including a video that called for the extermination of Muslims and Islam and had clocked 40 million views, but the content was not removed. However, a Facebook spokesperson denied this to the WSJ.

“Facebook’s toxic content flagging systems are broken and Facebook has never given a detailed breakup of content it claims it has removed from the platform (for example we do not know what protocols Facebook followed, or how much of the content was in English language or Hindi or another non-English language, from which country they removed the content, etc are some of the unanswered questions),” Manuvie said.

How the study came to be

The researchers claim that the study was commissioned in mid-2020. Nick Clegg, Facebook’s VP of Global Affairs, revealed that Facebook had commissioned a Human Rights Impact Assessment “several months ago.” In response to a letter submitted by various Indian civil society groups asking Facebook to address dangerous content in India, Clegg said that Foley Hoag would have “complete independence” in determining the methods and groups to consult, but suggested the law firm incorporate feedback from individual Facebook users and vulnerable groups, the WSJ reported.

In recent years, Facebook has released executive summaries of human rights impact assessments it commissioned on its operations in Indonesia, Sri Lanka, and Cambodia. In each instance, it said that the consultants who were engaged completed their work in less than one year, as per WSJ.

Facebook’s previous tussle with researchers

In August 2021, Facebook banned the accounts of New York University researchers, threatened legal action against researchers at the Berlin-based AlgorithmWatch, and restricted data access for researchers at Princeton University.


NYU researchers: Facebook banned the personal accounts of researchers who were part of the NYU Ad Observatory and suspended the team’s access to Facebook’s Ad Library and CrowdTangle, which they were using to study political ad-targeting. Facebook alleged that the researchers were carrying out unauthorised data collection through a plug-in they had developed for the research. Multiple civil rights groups, privacy activists, and a group of US senators challenged Facebook’s decision.

AlgorithmWatch: Facebook issued legal threats to researchers at AlgorithmWatch over their research into Instagram’s algorithm, specifically how Instagram prioritises pictures and videos in a user’s timeline. Facebook claimed that AlgorithmWatch’s study was flawed, with various issues in its methodology, and that it breached Facebook’s terms of service. It also claimed the study violated the GDPR.

Princeton University: The university’s researchers withdrew from studying political advertising on the platform through its ‘Facebook Open Research and Transparency’ (FORT) platform due to stringent contractual obligations. These included allowing Facebook to review their research before publication to remove any information related to “Facebook’s products and technology, its data processing systems, policies, and platforms, in addition to personal information pertaining to its users or business partners.”


Written By

I cover health technology for MediaNama, among other things. Reach me at anushka@medianama.com


MediaNama is the premier source of information and analysis on Technology Policy in India. More about MediaNama, and contact information, here.

© 2008-2021 Mixed Bag Media Pvt. Ltd. Developed By PixelVJ
