
What Facebook told the Delhi Peace and Harmony Committee, and what was left out

Facebook India’s Shivnath Thukral carefully avoided answering several questions

“By stonewalling our questions, by reserving your right to reply, you are frustrating the objectives of the committee,” Raghav Chadha told Shivnath Thukral, the head of the Public Policy team at Meta (formerly Facebook), at a hearing of the Delhi Assembly’s Peace and Harmony Committee. Thukral was appearing before the committee in relation to Facebook’s role in the 2020 Delhi Riots.

At the hearing, Chadha asked Thukral about the composition of content moderation teams in India and Facebook’s decision-making processes for content moderation and distribution, among other things. Thukral chose not to answer several questions related to the Delhi Riots and Facebook’s content moderation protocols, citing the fact that he was under no legal obligation to do so. It is worth pointing out that in his deposition before the same committee last year, journalist Paranjoy Guha Thakurta had suggested close proximity between Thukral and the BJP.

Facebook has tried to avoid appearing before this committee: in September 2020, Facebook India CEO Ajit Mohan moved the Supreme Court against the Delhi Assembly’s summons, challenging its validity, but the Court upheld the committee’s summons in July 2021. The committee had summoned Facebook after a Wall Street Journal report last year revealed that the platform did not take action against hate speech posted on it by BJP leaders. So far, Facebook India has received three summons from the committee.

Facebook’s failure to effectively moderate hate speech in India was recently highlighted by whistleblower Frances Haugen’s striking revelations. Haugen pointed out that Facebook had a very small number of personnel working on hate speech moderation and had done little to control hate speech in languages other than English. This is essentially the first public deposition by a Facebook executive in India focused on hate speech leading to riots. Prior to this, Facebook India MD Ajit Mohan had also deposed before the Parliamentary Standing Committee on Information Technology.

On Facebook’s efforts to regulate hate speech

  • No hate speech definition for India: On the topic of hate speech, Chadha asked Thukral if Facebook has a definition of hate speech specifically dedicated for India, to which Thukral responded:

    Our community standards are a global document. Hate speech falls into one of the categories in our community standards. We do have definitions around it… The definition is always evolving. For example, in India, we realized that caste could be a factor based on which hate speech could be put out… We evolved hate speech guidelines to include caste slurs in India as well. (1:08:00) – Shivnath Thukral

  • Facebook’s reluctance to take action: The committee also accused Facebook of being reluctant to take action in the context of hate speech:

    If I can talk through numbers, through investments of over 13 billion dollars, 5 billion dollars this year on safety and security. 40,000 people, including 15,000 in content moderation. Coverage of 70 languages, 20 languages in India for content moderation. I don’t think that’s a sign of reluctance. – Shivnath Thukral

    What the committee did not ask: Thukral claimed that Facebook covers 20 languages in India for content moderation. The committee did not ask if Facebook has provisions to detect and take down hate speech in these languages. Haugen recently highlighted that Facebook does not have hate speech classifiers for most Indian languages, the Indian Express reported.


    In response to MediaNama’s queries, a Facebook spokesperson told us that the company has hate speech classifiers “in only four Indian languages – Hindi, Bengali, Tamil and Urdu.” This implies that Facebook cannot detect hate speech in the 16 other languages in which Facebook is available in India.

On tackling fake news and misinformation

  • Organized disinformation campaigns: Chadha also asked Thukral about the actions Facebook takes against ‘organized disinformation campaigns’, to which Thukral replied:

    We have policies around organized behavior as well, and if we find violations of that protocol, we would take actions against those accounts as well. If we receive signals that someone is setting up multiple accounts to deceive users by pretending to be someone else, our team of cybersecurity experts will pick up those signals and we would end up removing those accounts. – Shivnath Thukral

    What the committee did not ask: Does Facebook enforce these takedown policies across the political spectrum, or has it failed to check coordinated campaigns by the ruling party in the past? Whistleblower Sophie Zhang claimed earlier this year that no action was taken on a coordinated inauthentic network after it was discovered to be linked to a sitting BJP MP.

  • Fact checking: Chadha also asked Thukral about Facebook’s efforts to fact-check content on its platform. Thukral pointed out that Facebook has a dedicated fact-checking initiative in India with 10 local fact-checking partners and the capability to fact-check in 11 languages.
  • Reducing distribution: Chadha asked Thukral to explain how the distribution of a post is reduced, to which Thukral responded:

    In 2018, when we ran an experiment, we wanted to make sure that friends and family are getting to see meaningful interaction, so the experiment led to the reduction of virality of videos, for example, in effect reduction of engagement, but we still went ahead and did it… We want to give users to override algorithms, when we make the changes, we ensure that political content or content which riles up people goes down, despite reduction in engagement, we went ahead and did that change. – Shivnath Thukral

    What the committee did not ask: Has Facebook ever chosen to boost engagement at the cost of spreading false and divisive content? The Wall Street Journal reported in September that internal teams suggested several algorithmic changes to reduce divisive content, but CEO Mark Zuckerberg refused to make such changes if there was a trade-off with engagement.

Questions Meta’s Public Policy Head refused to answer

These are the questions Thukral refused to answer, either citing lack of awareness or his right not to comment:

  • Could you help us understand the commercial arrangement (between Facebook and fact-checking partners)?
  • How much time do the fact-checking partners take to fact check content?
  • What is the composition of your news partnership team? Who heads that team for India?
  • During the Delhi Riots 2020, many incendiary and communally sensitive posts from India were amplified on Facebook. What measures have been carried out by Facebook India to remove such posts?
  • Can you give me examples of posts that are borderline violative of community standards but should not be removed from Facebook?
  • Did your civil society partners interview or inquire with the victims of the Delhi riots 2020?
  • Facebook has claimed that, based on your experience with the violence in Myanmar and Sri Lanka, your findings have helped reduce the spread of borderline content. Can you give us an account of the removal of such borderline content during and after the Delhi Riots?
  • Hypothetically, let’s say there is a post in the United States saying African Americans are termites who have a conspiracy to take over the country and need to be exterminated. Will this post be considered to be hate speech to be immediately removed from the platform?
  • Who all comprise Facebook’s global leadership?
  • How aware is the global CEO of the happenings in Delhi and how his platform has been used to further communal tensions and deepen the faultlines that exist?

Disclosure: MediaNama’s editor Nikhil Pahwa was among those who have deposed before the committee, prior to this Facebook deposition.


Written By

Figuring out subscriptions and growth at MediaNama. Email: nishant@medianama.com



© 2008-2021 Mixed Bag Media Pvt. Ltd. Developed By PixelVJ
