Popular instant messaging platform WhatsApp banned over 2 million accounts linked to Indian numbers in December 2021, according to its latest compliance report. In the same month, WhatsApp restored 23 banned accounts in response to requests made by users through its grievance officer.
Under the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, social media intermediaries, like WhatsApp, have to publish monthly compliance reports showing the number of grievances received as well as the amount of content that was proactively detected and taken down.
WhatsApp has become a medium for the spread of misinformation in India, according to several news reports. A recent investigation also reported that there was a proliferation of inauthentic behaviour among its users. Data from compliance reports give an indication of WhatsApp’s content moderation efforts in India.
User grievances received over safety issues and ban appeals
According to the report, WhatsApp received 528 user grievances through email or post to its grievance officer. Of these, it responded to 34; most were ban appeals, while one fell under 'Other support', a category the report says denotes requests that cannot be consistently classified otherwise. In November, WhatsApp had received 602 grievances from Indian users and responded to 36 of them, all of which were appeals against account bans.
The report also provides a category-wise breakdown of all grievances received.
User grievances related to product support and safety are redirected to the app's in-built reporting feature, and actions taken through that feature are not counted among the accounts actioned in response to grievances. The platform marks several categories as 'NA' because actioning an account may not be applicable in those cases, the report said.
Number of accounts banned and the mechanism used to decide
A total of 20,79,000 (about 2.08 million) accounts were banned by WhatsApp in December 2021. The platform classifies a banned account as Indian if the phone number carries the +91 country code. The report said these accounts are proactively detected at registration, during messaging, and in response to 'negative feedback' such as user reports and blocks. In November, WhatsApp had banned 1.7 million accounts from its platform.
What the IT Rules 2021 require
The IT Rules require social media intermediaries to:
- Publish periodic compliance reports: These reports should be published every month and have details of complaints received, action taken, and “other relevant information”.
- Appoint key managerial roles: Significant social media intermediaries (with more than 50 lakh registered users) must appoint a chief compliance officer, nodal contact person, and resident grievance officer, all of whom must be Indian residents and employees of the platform.
- Proactively identify and take down content: This includes content moderation (through automated mechanisms) of posts that are defamatory, obscene, pornographic, paedophilic, invasive of privacy, insulting or harassing on the basis of gender, among other categories.
- Disable content within 36 hours of a government or court order: Separately, intermediaries must provide information for verification of identity, or assist any government agency with crime prevention and investigation, no later than 72 hours after receiving a lawful order. They must also preserve records of disabled content for 180 days.
I cover health technology for MediaNama, among other things. Reach me at anushka@medianama.com
