WhatsApp banned over 3 million Indian accounts from its platform between June and July 2021, a new compliance report by the messaging platform revealed. Facebook, Instagram, and WhatsApp all released their monthly compliance reports on August 31, as mandated by the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.
Overall, the reports show an increase both in the amount of content that the platforms flagged (through their automated systems) and took action against, and in the number of user grievances received, as compared to their earlier report covering May and June 2021. These monthly reports show that major social media platforms like Facebook and WhatsApp are looking to stay in compliance with the IT Rules, as failure to do so could cost platforms their safe harbour immunity under the IT Act, 2000.
Content moderation by WhatsApp
WhatsApp banned the 3 million accounts on the basis of automatic detection and in-app reporting by its users. However, WhatsApp said, this number does not include user grievances received via email and post by WhatsApp’s Grievance Officer in India, Paresh B. Lal.
WhatsApp received 594 user grievances across the topics of account support, ban appeals, product support, safety, and others. Of the 594 grievances, 74 were actioned, which, according to the report, could mean banning an account or restoring a previously banned one.
The remaining grievances were not actioned due to one or more of the following reasons:
- The user required assistance to access their account
- The user required assistance to use one of WhatsApp’s features
- The user wrote to provide feedback
- The user requested restoration of a banned account and the request was denied
- The reported account did not violate the laws of India or WhatsApp’s Terms of Service
Content moderation by Facebook and Instagram
Facebook and Instagram disclosed their user requests and content actioning numbers together in a separate report. On almost every count, the two platforms detected and actioned more problematic content, and received more user grievances, than in their previous report.
Content actioned by Facebook
Here’s a breakdown of all problematic content Facebook took action against along with the percentage of such content its automated systems flagged:
Content actioned by Instagram
Unlike Facebook, Instagram does not yet report a metric for spam content. Here’s a breakdown of all the other content it reviewed:
Source: Facebook monthly compliance report
User grievances received by Facebook
All in all, Facebook said that it received 1,504 user grievances through the contact form on its website and via its Grievance Officer in India, Spoorthi Priya. All of these were responded to; in 1,326 cases, subsequent action for resolution was taken, which included providing users with tools to report content for specific violations, self-remediation flows through which they can download their data, avenues to address hacked-account issues, and so on.
The remaining 178 grievances were subject to specialised review by Facebook, after which 44 were actioned. These actions include removing a post, covering it with a warning, or disabling the account.
Here’s a breakdown, by topic, of these grievances:
User grievances received by Instagram
Instagram received 265 user grievances – a sharp rise from the 36 grievances it noted in its earlier report. Of these, 181 users were provided with tools for resolution, while the remaining 84 grievances were subject to specialised review, after which 18 were actioned.
What the IT Rules 2021 require
The IT rules require social media intermediaries to:
- Proactively identify and take down content: This includes content moderation (through automated mechanisms) of posts that are defamatory, obscene, pornographic, paedophilic, invasive of privacy, insulting or harassing on the basis of gender, among other categories.
- Publish periodic compliance reports: These reports should be published every month and have details of complaints received, action taken, and “other relevant information”.
- Appoint key managerial roles: Significant social media intermediaries (those with more than 50 lakh registered users) must appoint a chief compliance officer, a nodal contact person, and a resident grievance officer, all of whom must be Indian residents and employees of the platform.
- Disable content within 36 hours of a government order: The Rules also require intermediaries to provide information for verification of identity, or to assist any government agency with crime prevention and investigation, within 72 hours of receiving a lawful order. They must also preserve records of disabled content for 180 days.
Also read:
- Facebook compliance reports show WhatsApp banned 2mn Indian users
- Twitter’s compliance report reveals 94 grievances from May 26 to June 26
- A review of OTT streaming services compliance with IT Rules 2021 in August
I cover health technology for MediaNama, among other things. Reach me at anushka@medianama.com
