Micro-blogging platform Koo released its third compliance report on September 1, as mandated by the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021. The report details content Koo detected through automated tools, complaints it received and acted on, and, for the first time in the platform’s compliance reports, figures on spam content it detected and removed.
Overall, the numbers for removed content remained low, with most Koos (the platform’s term for posts/tweets) subject to ‘Other Actions’ such as overlays, blurring, ignoring, or warnings. Interestingly, unlike compliance reports from intermediaries such as Facebook and Google, Koo does not reveal the categories these reports fall under, such as bullying, harassment, nudity, or violence.
In Numbers: Koo’s content moderation
The compliance report provides the following details on community grievance reports and proactive content removals in August 2021:
Community-based spam moderation: In its first disclosure of category-specific moderation, Koo said the community reported 984 posts as spam. Of these, 45 were removed, while the rest faced ‘Other Actions’ such as being hidden or blacklisted.
Proactive spam moderation: Koo detected 5,052 Koos as spam. Of these, 145 were removed, while the rest faced ‘Other Actions’.
Community-based moderation: The platform said 4,493 Koos were reported by the community, up from 3,431 in July. Of these, 1,244 were removed, while the rest faced ‘Other Actions’.
Proactive or automated moderation: Koo proactively detected 38,456 Koos violating its content moderation policies, around 41% fewer than the 65,280 it detected in July. Of these, 1,220 were removed, while the rest faced ‘Other Actions’.
Koo’s content moderation policies
As noted before, Koo does not break down its content moderation numbers by specific category or guideline. However, it does enforce guidelines for content under the following categories:
- Hate Speech and Discrimination
- Religiously Offensive Content
- Terrorist and Extremist Content
- Violent Content
- Graphic, Obscene, and Sexual Content
- Sexual Harassment and Non-Consensual Nudity
- Cyberbullying
- Malicious Programmes
- Intellectual Property
- Spamming, Scamming, and Phishing
- Misinformation and Fake News
- Identity Theft and Impersonation
- Invasion of Privacy
- Illegal Activities
Koo’s run-ins with content moderation
In February, when Koo first came into public view after several Union Ministers began signing up, there were reports of hate speech and conspiracy theories on the platform.
In response, Koo said that it would:
- Set up an oversight board that would provide direction on content moderation when dealing with “grey areas” in select cases.
- ‘Harness the power of the community’ to moderate content.
- Use a combination of machine learning and community reporting to moderate content in vernacular languages.
What the IT Rules 2021 require
The IT Rules require social media intermediaries to:
- Proactively identify and take down content: This includes content moderation (through automated mechanisms) of posts that are defamatory, obscene, pornographic, paedophilic, invasive of privacy, or insulting or harassing on the basis of gender, among other types.
- Publish periodic compliance reports: These reports must be published every month and include details of complaints received, action taken, and “other relevant information”.
- Appoint key managerial roles: Significant social media intermediaries (with more than 50 lakh registered users) must appoint a chief compliance officer, nodal contact person, and resident grievance officer, all of whom must be Indian residents and employees of the platform.
- Disable content within 36 hours of a government order: Intermediaries must take down flagged content within 36 hours of receiving a lawful order. The Rules also require them to provide information for verification of identity, or to assist any government agency with crime prevention and investigation, no later than 72 hours after receiving a lawful order, and to preserve records of disabled content for 180 days.
Also Read:
- Interview: Koo co-founder Mayank Bidwatka on content moderation, comparison with Parler, and focus on mobile experience
- Koo takes down 498 posts from its platform in July; down by 60 percent compared to June
- WhatsApp, Facebook, and Instagram publish compliance reports mandated by IT Rules
I cover health technology for MediaNama, among other things. Reach me at anushka@medianama.com
