Content removed through automated detection increased by 13%, Google said in its Information Technology (IT) Rules, 2021 compliance report for the month of August.
The IT Rules require significant social media intermediaries (SSMIs) to publish periodic compliance reports. As in the compliance report for July, Google says the content was removed from “all its platforms”, but does not specify which platforms the report covers.
In August, Google removed 651,933 pieces of content through automated detection, up from 576,892 in July. Google says its automated detection technology is used to detect content such as “child sexual abuse, violent extremist content”. For the automated detection process, Google uses —
- Location data of the sender or creator of the content
- Location of account creation
- IP address at the time of video upload
- User phone number
Content removal based on user complaints decreases marginally
According to the compliance report for August, content removed in response to user complaints fell by 2.22% compared to July. The decrease can be attributed to fewer complaints received from users.
In August, Google removed 93,550 items based on 35,191 user complaints; in July, it had removed 95,680 items based on 36,934 complaints. While most of the removed content related to copyright violations (99.1%), other removals were based on court orders, graphic sexual content, and so on.
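As a quick sanity check on the percentages reported above, the month-on-month changes can be recomputed from the raw figures in the report (the calculation below is illustrative; the exact rounding convention Google used is an assumption):

```python
# Recompute the month-on-month changes from the figures in the report.
auto_aug, auto_jul = 651_933, 576_892            # automated-detection removals
complaint_aug, complaint_jul = 93_550, 95_680    # complaint-based removals

auto_increase = (auto_aug - auto_jul) / auto_jul * 100
complaint_decrease = (complaint_jul - complaint_aug) / complaint_jul * 100

print(f"Automated removals: +{auto_increase:.1f}%")          # ≈ +13.0%
print(f"Complaint-based removals: -{complaint_decrease:.2f}%")
```

The automated-detection figure works out to roughly a 13.0% increase, matching the report; the complaint-based figure comes to about 2.23%, close to the 2.22% reduction Google cites.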
Why were more items removed than complaints received? Google explained: “A single complaint may specify multiple URLs, referred to in the report as ‘items’, that potentially relate to the same or different pieces of content.”
The majority of the complaints received also pertained to alleged copyright violations (96.2%). The rest concerned trademark infringement (1.2%), defamation (1.2%), court orders, and so on. Here is a comprehensive look at what was removed —
What do the IT Rules say?
Rule 4(d) of the IT Rules specifies that significant social media intermediaries (those with more than 5 million users) must:
publish periodic compliance report every month mentioning the details of complaints received and action taken thereon, and the number of specific communication links or parts of information that the intermediary has removed or disabled access to in pursuance of any proactive monitoring conducted by using automated tools or any other relevant information as may be specified.
“To allow sufficient time for data processing and validation, there will be a two-month lag for reporting,” Google had said in its June report, adding that it would include more granular data in subsequent reports.
Also read
- Google Will Comply With India’s New IT Rules: Sundar Pichai
- Relying On Automated Takedowns, YouTube Removed More Videos In April-June Than It Ever Has
- Relying on automated detection, Google removes 576,892 content from all platforms: July compliance report
