*Update (February 26): This post has been updated to reflect the final version of the Rules, which were notified on Thursday evening.
The government has stuck to its stance on traceability on internet platforms. The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 make it mandatory for social media intermediaries to enable identification of the originator of a particular message or post. This, in effect, challenges end-to-end encryption on messaging platforms such as WhatsApp and Signal, and hence raises privacy concerns.
Note: This summary was initially based on a copy of the Rules that was released by the Internet Freedom Foundation, and had been circulating in journalist groups. The final Rules have differences that we have outlined below as and when relevant.
The Rules prescribe due diligence requirements for internet intermediaries, as defined under the Information Technology Act, 2000.
- The Rules define a social media intermediary as one that “primarily or solely enables online interaction between two or more users and allows them to create, upload, share, disseminate, modify or access information using its services”. (Note: The previous version of the Rules had excluded from this definition intermediaries that enable commercial or business-oriented transactions, provide access to the internet, or act as search engines, online directories or online storage services. This distinction no longer exists in the final version.)
- The Rules do not define “significant social media intermediary”, and the threshold will be notified by the central government.
- Additionally, significant social media intermediaries need to have a physical contact address in India that is published on their website or mobile app.
- Once the Rules are notified, significant social media intermediaries will have three months to comply with the additional due diligence requirements.
1. Traceability mandate: “Significant social media intermediaries” that primarily provide messaging services will have to enable identification of the “first originator of the information” when required by a judicial order passed by “a court of competent jurisdiction” or an order passed under Section 69 of the IT Act.
- Last resort mechanism: An order seeking identification shall not be passed in cases “where less intrusive means are effective in identifying the originator”. Essentially, intermediaries would be expected to identify the originator only if all other methods are likely to be futile.
- The guidelines say that such an order can only be passed for “purposes of prevention, detection, investigation, prosecution or punishment of an offence related to the sovereignty and integrity of India, the security of the State, friendly relations with foreign States, or public order, or of incitement to an offence relating to the above or in relation with rape, sexually explicit material or child sexual abuse material, punishable with imprisonment for a term of not less than five years”.
- Need not disclose message content: Significant social media intermediaries, while complying with an order to identify the first originator of a message, will not be required to disclose the content of that message or any other information related to the originator or other users.
- When information originates outside India: When the originator of a particular piece of information is found to be located outside India, the first originator of that information in Indian territory will be deemed the originator for the purposes of this clause.
2. Proactively identify, take down content using automated tools: Significant social media intermediaries “shall endeavour” (suggesting a best-effort rather than a binding obligation) to deploy technology-based measures, such as automated tools, to proactively identify information that depicts rape or child sexual abuse (CSA), or any information that is “exactly identical” to information that was previously removed or to which access was disabled (see the illustrative sketch after this list). Content taken down through such tools would have to be flagged for users trying to access it later.
- Proportionate to free speech: The measures taken by intermediaries would have to be proportionate, having regard to the right to free speech and expression and user privacy.
- Measures will need human oversight: The measures implemented by intermediaries will need to have human oversight, including a periodic review of any automated tools deployed as part of them.
- Automated tools should tackle issue of bias: The review of the automated tools would evaluate their accuracy and fairness, propensity for bias and discrimination, and their impact on privacy and security.
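To make the “exactly identical” requirement concrete, here is a minimal, hypothetical sketch of how a platform might flag exact re-uploads of previously removed content by comparing cryptographic hashes. The Rules do not prescribe any particular technique; the function names and the in-memory blocklist below are illustrative assumptions only.

```python
import hashlib

# Hypothetical store of fingerprints of content that was previously removed
# or disabled; the Rules do not prescribe any specific mechanism or storage.
removed_content_hashes: set[str] = set()

def fingerprint(content: bytes) -> str:
    """Return a SHA-256 digest; identical bytes always yield the same digest."""
    return hashlib.sha256(content).hexdigest()

def record_removal(content: bytes) -> None:
    """Remember the fingerprint of content that has been taken down."""
    removed_content_hashes.add(fingerprint(content))

def is_exact_reupload(content: bytes) -> bool:
    """Flag an upload only if it is byte-for-byte identical to removed content."""
    return fingerprint(content) in removed_content_hashes

# Example: an identical re-upload is flagged; a modified copy is not.
record_removal(b"previously removed post")
assert is_exact_reupload(b"previously removed post")
assert not is_exact_reupload(b"previously removed post, slightly edited")
```

Note that hashing of this kind only catches byte-identical copies; detecting edited, cropped, or re-encoded variants would require perceptual hashing or similar techniques, which the Rules do not spell out.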
3. Disabling content within 36 hours of government order: All intermediaries have to remove or disable access to information within 36 hours of receiving a court order, or of being notified by an appropriate government agency, under Section 79 of the IT Act.
“[T]he intermediary shall remove or disable access to that information, as early as possible, but in no case later than thirty-six hours from receipt of the court order or on being notified by the Appropriate Government or its agency” — Rules 2021
- Providing information to govt within 72 hours: Additionally, intermediaries have to provide information for verification of identity, or assist any government agency with crime prevention and investigations, within 72 hours of receiving a lawful order.
- Preservation of records of disabled content for 180 days: All intermediaries will have to maintain records of content that they have removed or disabled access to for 180 days, for investigation purposes. This period can be extended by a court order or by a government agency authorised to do so.
Additionally, significant social media intermediaries will have to appoint a nodal contact person for 24X7 coordination with law enforcement agencies and officers, to ensure compliance with their orders. This person needs to be an employee and a resident of India. (Note: The previous version of the Rules required them to also be Indian citizens.)
4. Voluntary takedowns: All intermediaries will have to take down content that violates any law; is defamatory, obscene, pornographic, paedophilic, invasive of privacy, or insulting or harassing on the basis of gender; relates to money laundering or gambling; or is “otherwise inconsistent with or contrary to the laws of India”.
5. Disabling content within 24 hours of user complaint: All intermediaries will have to take down, within 24 hours of an individual (a user or a victim) reporting it, content that exposes a person’s private parts (partial or full nudity), shows any sexual act, impersonates a person, or contains morphed images.
An intermediary will have complied with Section 79 of the IT Act (or rather, not have violated any of its clauses) only if it is able to remove or disable access to content in all three of the ways mentioned in points 3, 4 and 5.
6. Putting back of content; grievance redressal mechanism: All intermediaries need to publish on their website or mobile app the name of a grievance officer, to whom users or victims can make complaints about violations of the Rules. These complaints have to be acknowledged within 24 hours and disposed of within 15 days of receipt. Intermediaries will also have to provide users a mechanism to stay informed about the complaints they have made.
Additionally, significant social media intermediaries will have to provide a mechanism that allows complainants to track the status of their complaints, with each complaint assigned a unique ticket number. Additional compliance requirements for significant social media intermediaries include:
- Chief Compliance Officer: Significant social media intermediaries will have to appoint a Chief Compliance Officer, who has to be part of the key managerial personnel or another senior officer resident in India. This person would be responsible for ensuring compliance with the IT Act and the Rules. (Note: Per the previous version of the Rules, this employee also had to be an Indian citizen.)
- Resident Grievance Officer: Significant social media intermediaries would have to appoint a Resident Grievance Officer, who will have oversight of the grievance redressal mechanism. (Note: Per the previous version of the Rules, this employee also had to be an Indian citizen.)
- Appealing takedowns/putting back content: When content has been taken down or disabled, significant social media intermediaries need to provide a notice explaining the action to the user who “created, uploaded, shared, disseminated or modified” it. The user will be given an “adequate and reasonable” opportunity to dispute the action taken by the intermediary and request reinstatement of the content. Additionally, users trying to access such information must be given notice of why it was disabled or removed. (Note: The previous version of the Rules required the notice to be provided only to the originator of the information.)
- The ministry can also call for additional information from significant social media intermediaries as it may consider necessary.
7. Transparency reports: Significant social media intermediaries are required to publish compliance reports every month, with details of complaints received, action taken, and “other relevant information”. These reports will also contain the number of links or pieces of information removed through proactive monitoring using automated tools. (Note: The previous version of the Rules required the reports to be published every six months.)
8. Voluntary verification of users: Significant social media intermediaries will be required to allow users in India to voluntarily verify their accounts using “any appropriate mechanism”, which could include users’ Indian mobile numbers. Verified users would be provided a “demonstrable and visible mark of verification” that is visible to all users of the service.
Also read:
- Summary: Intermediary Guidelines 2021 From An OTT Streaming Services Perspective
- Summary: Intermediary Guidelines 2021 From Digital News Perspective
Read the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021
***Note (5:54 PM): Changed headline. Originally published at 5:34 PM, February 25.
***Correction (6:46 PM): Changed parts of the post that implied that content takedown timelines only apply to social media intermediaries, and not to intermediaries in general. Added a note about intermediaries having to provide information to verify identity.
