Transcript and Video: MEITY’s Rakesh Maheshwari on IT Rules, 2021; traceability, intent, compliance timelines

On April 23, MediaNama held a discussion on the Impact of the IT Rules, 2021 on Intermediaries. Rakesh Maheshwari, Senior Director and Group Coordinator of Cyber Law & eSecurity at the Ministry of Electronics and Information Technology answered questions on the Rules during the session, on subjects like traceability, the chief compliance officer to be employed by social media intermediaries, and the intent of the law. We will be publishing a story on his session separately.

Below is the video and transcript of the closing remarks and the Q&A.



The following transcript has been lightly edited for clarity.

April 23, 2021

Nikhil Pahwa, Editor, MediaNama: So, I think there are lots of issues and lots of points which have been raised. I was wondering if you would like to share your views and your comments.  I know we’re starting a little early.  But since we have you here, I thought we might want to explore your views on these rules that you’ve been so closely involved with architecting.

Rakesh Maheshwari, Senior Director and Group Coordinator of Cyber Law & eSecurity, Ministry of Electronics and Information Technology: Yeah, so first of all, good evening to all of you. I was listening to some of the earlier views in the program. But thereafter, I was not in the conversation and have therefore lost track of the points.

I would like to say certain things. 

First of all, the intent of these rules, from our side, has been that there should be a safe and secure ecosystem for users of various platforms in the country; some higher level of accountability for all the intermediaries, and in particular for the significant social media platforms. Let's first of all understand some of the reasons which prompted us to make the changes the way they have been made.

Everybody knows the Shreya Singhal case. Then there was the Prajwala court case, wherein almost every tech company, all the big platforms, were party to it; the government was party to it; and in the decisions which happened in the court, the advocates from both sides agreed, so there was practically complete unanimity in all the decisions that happened. And the clear recommendation in the Prajwala case was that you need to curb — in fact, the court has used a very harsh word: eliminate — child sexual abuse, rape, and gang rape imagery.

The very question from where it started was: if something has been detected to be unlawful on the internet, on a particular platform, is it really possible to ensure that the same thing is not uploaded again? The question was prevention. But where we agreed was that it should be possible not to allow uploading of the same digitally identical content on the same platform. And that's where it was agreed that maybe something on such issues needed to be done.
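A minimal sketch of what "not allowing uploading of the same digitally identical content" can mean in practice, assuming exact-match hashing. This is purely illustrative, not anything prescribed by the Rules or used by any named platform; real systems typically need perceptual hashing (for example, PhotoDNA) so that matching survives re-encoding, which a plain SHA-256 digest does not.

```python
import hashlib


class UploadFilter:
    """Illustrative hash blocklist: refuses re-uploads of byte-identical content."""

    def __init__(self) -> None:
        self.blocked: set[str] = set()

    def block(self, content: bytes) -> None:
        # Remember the SHA-256 digest of content already found unlawful.
        self.blocked.add(hashlib.sha256(content).hexdigest())

    def allow_upload(self, content: bytes) -> bool:
        # Only exact byte-for-byte copies of blocked content are refused;
        # any single-byte edit changes the digest and slips through.
        return hashlib.sha256(content).hexdigest() not in self.blocked
```

The limitation in the last comment is exactly why "digitally identical" is the operative phrase in the discussion above: the scheme says nothing about edited or re-compressed copies.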

Then, of course, there was also the Rajya Sabha ad hoc committee on pornography, there were certain recommendations which were made.  Very recently, there were a few court cases, like X v. Union of India.  And once again, the courts — in fact one of the decisions has come just about three to four days back — were looking for how the larger solution can be achieved.

So these were some of the cases. The draft [Intermediary Rules] were released in December 2018. There were comments which we received over a period of time, and a lot of discussions happened throughout this period. At times we saw that the rules were getting stuck and we were not really able to move forward. But then, we also felt the need to have some sort of rules under the just-amended Allocation of Business Rules (AOBR) for the Ministry of Information & Broadcasting. And because of the requirement for rules to be published for them, the fact that our rules were already waiting in the pipeline [moved things along].

We thought that in the given scenario, given the limitations of the Act that we have as of now, the best route would be to have this combined rule covering intermediaries as well as publishers: the VOD platforms and the online news platforms. This rule has been published. From the intermediary perspective, there were one or two more cases, in fact, wherein the statements were particularly that the platforms had been deleting or suspending people's accounts on their own, maybe with their knowledge or without their knowledge.

As the cases came along in court, people were claiming that [the suspension] was done without their knowledge; we don't really know. Then we realized: why not require, in certain cases, that at least some sort of advance warning be given by the platform to users before their accounts are suspended for, maybe, violation of the platform's policy. So it was all those reasons which culminated in the present version of the rules.

Now, beyond this, I also completely agree that there is no one-stop solution. Given the way the intermediaries have grown, the kind of complexities which have come up over a period of time, and the constraint that the IT Act has not been able to keep pace with the changing requirements, that three-tier structure has been created, as of now.

Now, as far as the rules which are meant for all intermediaries are concerned, there is practically no major change, except that some terms and conditions for the users, which are more guidance to the users from the platform side, have to be conveyed. That is where we have emphatically conveyed that fake news is something which needs to be addressed. Of course, a certain record retention period has also been increased: whether the user deletes their account, the platform deletes the account on its own, or the platform removes a particular account at the instance of the court or the government, in all these three cases the requirement of keeping certain information about the users has been increased from 90 days to 180 days.

And this was particularly felt in those cases where the user himself removed the account, or the platform removed it, and subsequent investigations required the data to be there and it was not available. So, per se, because storage is not too costly, and given the kind of information which is otherwise available with the platform, if the user continues to be with the platform this information otherwise also remains with the platform.

So this is where it has been increased. Simultaneously, we wanted that the user should be made aware of the terms and conditions in whichever way the platform wants to convey them. In our December 2018 draft, we had mentioned that it could be done periodically, on a monthly basis. But we realized that given the number of apps that anybody will be using, the number of intermediary platforms that anybody will be interacting with, this would become onerous, and practically people would tend to just ignore such messages.

We have decreased the frequency to once in 12 months. The rest is more or less on the same lines as the 2011 rules. Now, coming to significant social media: first of all I'll talk about social media itself. Yes, it's true; I was party to some of the discussions which took place.

It's true that in the present version, the way the definition of social media has come out, it can be quite broad and quite misunderstood. And for that reason, see, our aim is not to have more and more platforms being declared as significant social media platforms. The whole intent is that wherever the risk is, and wherever the platform is reasonably large, those platforms should be considered significant. And to that extent, we have already started working on putting up a clarification as to what constitutes a social media platform. And because such a definition has already gone into the PDP Bill, let's hope the PDP Bill becomes a reality in the coming Monsoon Session of Parliament.

The definition of social media, for that reason, will perhaps come out in a better version as and when the PDP Bill becomes a reality. But for the time being, as I mentioned, the intent is not to cover more and more platforms. The intent is to cover those platforms which clearly provide the possibility of higher and higher social interaction, which have a possibility of content going viral, which have a potential to cause damage, and which are otherwise not in a specific or niche domain, whether a business domain, or within an organization, or for a specific kind of activity that you are carrying out; maybe such niche platforms should not be considered social media platforms. So ideally, whichever leaked version was being talked of is what we are looking at when we are talking of the definition of the social media platform.

Now, having said so, see, the problem being felt as of today is that while we have the larger platforms available and providing services in the country, legally they all absolve themselves by saying that their Indian counterparts are not the ones providing services in the country. So we end up talking to faceless entities which are providing services in India, but which do not directly understand the culture and the sensitivities of the Indian mindset.

And for that reason only, the concept of the chief compliance officer, the nodal officer, and the resident grievance officer has been brought into the picture. Because there are a large number of cases wherein, when you were making grievances to them, maybe not even acknowledgments were being sent. And then, because you are not really sure whether your grievance has reached them or not, maybe you are also writing to California or to Ireland by post, so that your grievance is at least reaching there and hopefully gets heard.

And for that reason, and in fact because in so many court cases the Indian entities stood, once again, apart from their US or European counterparts, we have asked them to have an office here in India. Taxation is not our intent as far as the Ministry of Electronics and IT is concerned. But definitely, accountability of these platforms to the Indian citizen, to the Indian user, and to the government should remain. Because it is the responsibility of the government to provide a safe, secure environment to the citizen.

Now, going back to the Prajwala case again, and given the number of complaints that are continuously being received, we also went beyond the Act to say that in certain scenarios (we have, in fact, avoided the word revenge porn, we have avoided the word victim) if an individual's privacy is breached, and privacy of a certain nature is breached, if the person gets impersonated, or if there are, say, deepfakes and morphed images of the person which are created, then we have, in fact, empowered those people to be able to directly reach out to the platforms — to any intermediary platform for that reason — and to seek resolution of their problem. Once again, we are working on a few SOPs, because sometimes people may not be directly able to reach out. Not everybody is well conversant with the Act, or with the internet. And not everybody will be a user of that particular platform.

So while you may not be the user, and while you may not be conversant with it, how do we still facilitate easier reporting and earlier resolution? So we are trying to work out some three or four options, wherein the platforms will also be given an option, and are also being encouraged, to come up with trusted flaggers; some of them already have.

So if a trusted flagger brings it to your knowledge, if law enforcement brings it to your knowledge, if the person directly approaches, or if the person approaches through some authorization mechanism, the grievance in such cases should be expeditiously heard and taken care of. The same now applies to the timelines: initially, as per the 2011 rules, there were 36 hours for acknowledgement and 30 days for resolution.

So with the change of technologies, with the ability to, say for example, give an auto-acknowledgment, these timelines have also been revisited, and have been changed to 24 hours and 15 days. This does not mean that every case has to be done within 15 days, but definitely every platform must make an earnest effort. And if for some reason they are not really able to do it within the stipulated time, whether it is 24 hours or 15 days or maybe 36 hours or 72 hours (that is how the various timelines have been defined in the rules), let there be an earnest attempt. Let there be an acknowledgment that because of all these reasons, or maybe because of one or two of these reasons, there is a delay; exceptions can always be there. The rules unfortunately do not work the way the industry works. For example, I have worked on so many e-governance projects wherein we try to define the SLAs: that 90% of the time, you should be able to solve this particular problem, or the website should come up within X amount of time.

So, unfortunately, given the wide variety of intermediaries that we have, and the need to have rules that are as simple as possible, maybe the expectations have been set. Thereafter, it is always on a one-to-one, case-to-case basis that the actual timelines can be decided. But if there is an acknowledgement, and if the reasons for delay, if any, become clear, I'm sure it's more important that the due diligence is done, rather than just sticking to whatever timelines have been defined.

However, I would simultaneously also like to mention that if there is a systemic failure, if you don't tune your system, if you don't, in some cases, increase the number of moderators, then of course, as a system, you will not be able to adhere to those timelines; which is what exposes you to losing the intermediary exemptions that have been given under the law.

So, the law was very clear right from day one that intermediaries are exempted subject to certain due diligence; now, if you do not follow these due diligence requirements, you lose that exemption. Earlier also, doubts have been raised that there are criminal provisions; unfortunately, that is because the Act already had criminal provisions. So if an extant law is being violated, you become vulnerable to that. But once again, that's not our intent.

Our intent is compliance. Our intent is that in a one-off case if some things go wrong, nobody should be chased. And therefore, particularly say for example with respect to the chief compliance officer, we have very categorically conveyed that an opportunity has to be given before anything is acted upon. So I would like to use this platform to convey that it is not the aim of the government to chase the chief compliance officer, or for that reason the intermediary, and then for every possible scenario maybe look forward and communicate that maybe you have lost your exemptions.

We are only looking at cases where there is clear, willful defiance or systemic failure, and that too we will be trying to address as much as possible. At least, that is our aim through the mechanisms that are being established: the mechanism of the chief compliance officer and the nodal officers that are being required of certain, particularly larger, platforms.

It is also possible that there are intermediaries which may not fall into the categories of social media or significant social media platforms, but which carry significant risks. The government once again has the ability to relook at the risk level. There can be, particularly, hyper-local intermediaries which, if they are becoming a risk, the rules give you the option to declare also as significant; and then the due diligence expected from a significant social media platform also becomes applicable to them.

Nikhil Pahwa: Will this SOP that you’re working on be opened up for consultation, will it be notified in the gazette?

Rakesh Maheshwari: SOPs will not be notified in the gazette. SOPs are only meant to support the rules, and in no way replace the rules themselves. However, SOPs set the expectations that the intermediary platforms can have, the government can have, and the citizen can have. Once again, the rules are setting certain expectations, or certain principles; the SOPs address the way they need to be implemented.

Nikhil Pahwa:  What kind of legal backing will the SOPs have? How can they be enforced if they are not notified?

Rakesh Maheshwari: For the proper implementation of the rules, and for the intent of the rules to be met, we believe it is highly desirable that in some cases we come out with standard operating procedures. Standard operating procedures are what we have, for example, with the few people we interact with regularly in our [Section] 69A [of the IT Act] processes. So while there are clearly laid-down rules, the fact is that, between the way the rules were written and the way the rules are operated over a period of time, there is a need for a clear understanding of these rules between the various stakeholders. And that is where, on a consultative basis, we developed the SOPs, and the same should be true this time as well.

Nikhil Pahwa: But sir, it’s not consultative then because it’s not putting it up for consultation.

Rakesh Maheshwari: It does not mean that everything has to be put on the website for it to be called consultative. Those who are the stakeholders will definitely be consulted before it is done. And there is nothing which will remain in hidden space; there are certain limitations which we have in the 69A process, but there may not be any such limitations as far as these SOPs or FAQs are concerned. As and when these SOPs or FAQs are developed, they will once again be available in the public space as well.

Nikhil Pahwa: Fair enough sir. For me, the most important stakeholders for any government are the citizens of the country. And if they’re not involved in this process, then I guess you’re then giving more importance to—

Rakesh Maheshwari: No, no; for that reason, draft rules were already published. Comments were also published; counter-comments were also published. So it's not that we did not do any consultation. But that does not mean that... see, we have certain timelines also to meet. If we go in for full-steam consultation every time, the fact will remain that the rules will never get done. This is what democracy in India is.

Nikhil Pahwa:  One concern raised by panelists in this discussion has been that the requirement for identifying the originator would require re-architecting of global platforms.  And so the three months time frame was a little too short. Is there any scope for expanding the timeframe for that?

Rakesh Maheshwari: So do you agree that it’s doable, first of all, do you agree that technically it’s doable?

Nikhil Pahwa: So actually sir, pretty much every technical stakeholder that we've spoken with has said that it's not doable within the bounds of end-to-end encryption. And they've asked for a technology-specific consultation, and at least a year. We have Debayan with us on this call. Debayan, do you want to share your views on this?

Debayan Gupta, Department of Electrical Engineering & Computer Science, Massachusetts Institute of Technology: Nice to meet you, Mr. Maheshwari. As I was saying before with the airplane example, I think the government probably has good reasons for doing what it is doing, and you summarized them very nicely. The problem is when one puts forth a possible implementation, like hashing, or a requirement like saying we want to trace the first originator. The government can't possibly be an expert on every single platform; they are using all sorts of different technologies.

Rakesh Maheshwari: Yeah.

Debayan Gupta: But once these requirements are put in place, because these are global systems, and these changes will cost them, make no mistake, huge amounts of money, right?

And they’ll be hiring people like me, so I’ll be making money off of this. I’m quite happy, in a monetary sense, but I’m very unhappy in a technical sense. Because, first, when such requirements are put forth, there needs to be a lot of consultation about being very, very, very precise about exactly what is required, I’m talking mathematical theorems-level precise, because at the end of the day, that’s what’s happening on a computer.  

Rakesh Maheshwari: Yeah.

Debayan Gupta: And when you do not have that level of precision, when you do not have those discussions before the companies are asked to follow the rules, suddenly what happens is that companies don’t know what to do.

In fact, some time ago (I'll give this to you word for word) Nikhil asked me: do you think this can be done in three months? And I gave him the actual answer, which is: I don't know. Which in my opinion is a worse answer than "no, it can't be done in three months". If the question is precise enough, then I can say yes or no, then I can put a timeline on it; then it's a different question.

Rakesh Maheshwari: Debayan-ji, first of all, good to listen to you. But let me also tell you that it's more than two years after the draft rules were made public that we are actually coming up with the final rules. For at least one of the platforms, which everybody has been talking about, this requirement was put forth by the government right from day one. So it's not that they have to build it only now.

I have two simple points. I mean, it's not that if it is not technically doable, then it can't be done at all. But our understanding is clear; we were clear right from day one, that the electronic message will have to be provided by the government [to the platform]. So that means we are not at all looking at the way the encryption is done, or the way decryption is being done. We are not at all looking at it; we are only looking at the fact that on the end user's device, the message does remain unencrypted. And if it is simply being forwarded, then before it is forwarded, it is the same message, and hence the hash should remain the same. Now, how exactly it is to be done, which technical architecture is to be deployed, that is best left for the platform [to decide].
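The premise here is that an unmodified forward is byte-identical, so its hash is stable. A toy sketch of that premise follows; the names (`digest`, `first_seen`, `record`) are hypothetical, and this is not WhatsApp's, any other platform's, or the government's actual design.

```python
import hashlib


def digest(message: str) -> str:
    # Stable fingerprint of the plaintext as it sits, unencrypted,
    # on the end user's device.
    return hashlib.sha256(message.encode("utf-8")).hexdigest()


# Hypothetical per-platform index from a message digest to the first
# account seen sending it: the "first originator" in the Rules' vocabulary.
first_seen: dict[str, str] = {}


def record(sender: str, message: str) -> None:
    # Only the first sender of a given digest is remembered.
    first_seen.setdefault(digest(message), sender)


record("user_a", "example message")  # original send
record("user_b", "example message")  # unmodified forward: same digest
```

Under this premise, a message later supplied to the platform hashes to the same key, so the index names `user_a`. The entire scheme rests on the forward being byte-for-byte identical to the original.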

From a legal perspective, the 2011 rules also mentioned, and now it is also mentioned, that there are certain expectations from the users: the user shall not be involved in doing A, B, C, D, X, Y, Z activities. Now, platforms cannot simply take the shelter of being an end-to-end encrypted platform, thereby saying that while I put these terms and conditions for the users, I completely remain blind to what happens on my platform. The platform therefore has the responsibility that, while you continue to provide the privacy, you work out certain mechanisms.

And whatever we have done, whatever we were asking since the beginning, was also based on certain inputs made available to us. We may be completely wrong. So there are certain assumptions with which we have gotten ahead.  There were certain technical inputs which were made available to us, which we also shared with the platforms.

Once again, not that this is the solution, but that this is also one of the possible solutions. How is up to the platform; that's one. Two, of course, to prevent any misuse, lots of checks and balances have already been made, particularly on this part of the sub-rule. So I do not foresee misuse. I mean, if it is technically feasible per se, the requirement, I hope, cannot be misused, given the checks and balances that we have created.

Lastly, you also mentioned that if there is a better means of doing it, a less intrusive way of doing it as far as breach of privacy is concerned, then that lesser means will be utilized. So it is for the platforms to come up with what they can offer to the government for the particular condition that we do want. What was our intent? Our intent is that if there is trouble being created in the system, the system cannot just take the shelter of being end-to-end encrypted and therefore be completely unaware, and hence completely escape the problem without anything being done by the platform.

So we want platforms to be accountable, we want people to also be accountable if they are spreading, particularly some such kind of messages, which are against the sovereignty and integrity [of India] and such kind of things, or maybe which are completely denounced across the world.

Debayan Gupta: So, one quick clarification on the hashing point: are you assuming that I'm using WhatsApp's official messaging app? Because one of the biggest problems that we technical experts have been dealing with is that, suppose I'm using messaging system X, there is no reason that I use their official app. Bad parties or criminal parties especially, and half my students, can build alternative apps that, to, let's say, Facebook or Google or whoever, appear to be their official apps. But now I can do whatever I want: I can get a message, attach a different hash to it, and forward it, right? So this suddenly opens this up to this huge—
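The objection can be made concrete: SHA-256 has an avalanche effect, so a client that changes even one character before forwarding yields a digest unrelated to the original, and byte-level hash matching silently fails. A purely illustrative sketch, with no specific platform's scheme implied:

```python
import hashlib


def digest(message: str) -> str:
    # Exact-match fingerprint of the plaintext bytes.
    return hashlib.sha256(message.encode("utf-8")).hexdigest()


original = "example forwarded message"
# An unofficial client appends a single, visually negligible character.
tampered = original + " "

# The two digests are completely unrelated, so any index keyed on the
# original digest will never match the tampered forward.
unrelated = digest(original) != digest(tampered)
```

This is why precision about what "the same message" means matters: the guarantee evaporates under any transformation, however trivial, by a non-official client.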

Rakesh Maheshwari: Yeah, yeah. I completely agree, I completely agree. The rules, once again, particularly the kind of rules that we will always have on the internet, will never provide a 100% solution to a given problem. Okay, so we are trying to address…

Nikhil Pahwa: But you have imposed a liability for not providing a solution to the problem.

Rakesh Maheshwari: No, no, no. Nikhil-ji, you will have to understand the kind of options, and the kind of technological alternatives, that we have to a given problem. There will always be more intelligent people; there will always be people who will be able to bypass the system.

Rules particularly cannot be made for those extreme cases. Rules are to be made such that, in general, they should suffice. On the internet, anything can be bypassed, whether we block or whether we ask to remove a particular thing. So whichever way you are looking at it, there can always be a bypass available.

We are not forgetting about those exceptions. The rules should by and large be able to meet the expectations of the government, as well as, I hope, the users. In fact, the platforms with whom I have been interacting are mostly willing to cooperate and work with us. Once again, I believe it's therefore doable, and everybody is willing to cooperate. Yes, if three months turns out to be short for a particular sub-rule, or a particular clause of a particular sub-rule, then the government will always remain open to those aberrations, and be practical about the time it takes to implement a particular process-wise solution, or a technical solution, or a manpower solution, whichever way it turns out.

Nikhil Pahwa: Sir, there's a question from Rathima in the comments: is sub-rule 3(2)(b) only intended to cover revenge porn and sexual imagery, or are deepfakes and morphed images separately covered as well? Can it be extended to cover other doctored speech and things like that also?

Rakesh Maheshwari: There are two scenarios. One is something which bothers me a lot because it is about me. We have tried to cover those scenarios, wherein we have given, at least from the rule perspective, direct access [to the platform]. Otherwise, intermediaries are supposed to act only when the appropriate government or its agency, or the court, says that certain material is unlawful; otherwise, intermediaries have a reasonably, I should say, good amount of freedom available to them, earlier also, now also. Except for such specific content, where we are trying to empower the user [to get it removed].

Nikhil Pahwa: So doesn't this, in effect, also partly subvert Shreya Singhal on Section 79?

Rakesh Maheshwari: Yes sir.

Nikhil Pahwa: Doesn’t it subvert?

Rakesh Maheshwari: Yes sir, yes sir, I agree, I agree.

Nikhil Pahwa: So then that’s legally going against the Supreme Court.

Rakesh Maheshwari: No, no, sir. Once again, what we miss is that Shreya Singhal is with respect to 79(3)(b), as far as Section 79 is concerned. However, the government continues to have, and will have, the ability to prescribe the due diligence, as is expected under 79(2)(c). Now, this due diligence is a moving target. What was due diligence in 2011 need not be the same due diligence in 2021, and in future also this due diligence may change. This due diligence is what we are trying to use to remain aware of the current problems and try to address them as much as possible.

Nikhil Pahwa: There’s a question from Harshata, and we’ve been asked this multiple times.  What is the logic of 50 lakhs, or 5 million, when we have almost 500 million social media users?

Rakesh Maheshwari: So I will say, we have gone by a gut feeling. It is a number which you had also mentioned, based on the German act, wherein the number is 2 million; we have fixed it at 5 million, and it has been fixed through a separate notification so that it can be changed at a later time if the need is felt. Once again, what was the aim? If we defeat our purpose, then there is no point. So we remain open. Today, this is the number which we have fixed as a threshold. If in future it is felt that it has to be decreased or increased, it will be accordingly done without changing the rules.

Nikhil Pahwa: Sir, again, another confusion was related to this.  Is there a regulatory process for determining whether an entity has more than 50 lakh users?

Rakesh Maheshwari: No, there is no— so the whole IT Act, the whole IT [Rules], is based on a complaint-driven model…

Nikhil Pahwa: No sir; how do you define 5 million users here, for something to be treated as a significant social media intermediary? Is it 5 million registered users since inception, or is it 5 million monthly active users, which is a moving target? We raised this issue even at that January 4, 2019 meeting, about this definition of 5 million.

Rakesh Maheshwari: Okay, so first of all, what we have mentioned today is that it is 5 million registered users which will be counted to determine the threshold, whether they are inactive or active users. It is for the platform to come up with a policy on what to do with inactive users. Active users, in any case, everybody will count, so that is not a problem. It is a problem only when you have a good number of inactive users. So you now need to take a call whether you would like to continue to have them on the platform, or maybe you put them out of your active numbers; and then, if at all they need to come back, maybe they need to be verified, or… it is up to you to define your policies.

So some bit of policy needs to be made clear as to what exactly you mean by an active user, or a user on the platform, and beyond what time a user will be taken off the platform, or the intermediary has the ability to take them off its platform. We are not specifically using the terms weekly active user, daily active user, or monthly active user, because that is something which is well known and can vary from month to month for a given interval. That is your business.

Nikhil Pahwa: So that means that anyone with more than 5 million registered users will have to come to MEITY and disclose…

Rakesh Maheshwari: No, we have not asked for it this way. But the ministry has the right to ask a particular platform, if the activities of that platform fulfil the criteria of a social media platform, whether its number of users is above this threshold or below this threshold. We are not even interested in knowing the exact numbers.

However, if the need be, the ministry can always ask; that is also one of the rules, which says that if the need arises, the ministry has the right to ask. But of its own, we are not registering them, and we are not asking all the intermediaries, or all the social media intermediaries, to disclose. No, we are not.

Nikhil Pahwa: There’s a question from Peter Simon from GitHub, on this.  Because there’s a they’re a technology platform for technologists. He’s asking, why was the exemption for business oriented platforms, networking intermediaries, search engines, encyclopedias, directories, emails and online storage which was in the leaked draft, dropped from the notified text?

Rakesh Maheshwari: Okay, so this question I will not be able to answer as of now. But yes, it has been dropped. Once again, I will only like to mention, as of now, that the intent of the definition of social media remains as it was earlier. So this part, therefore, we will try to clarify through the FAQ, as and when it gets published.

So, I would therefore like to assure you of the definition which was there. In fact, I will say not the leaked version: the definition which is presently there in the PDP Bill is what the intent of a social media platform is.

Nikhil Pahwa: There’s a question from Deepanshu Maheshwari, did the government do any cost benefit analysis, given the huge potential cost of traceability, as suggested by cryptography experts?  What does the government stand to gain from this rigid insistence?

Rakesh Maheshwari: No, no, once again, see, it is not for the government to do the cost-benefit analysis. The government has to protect its citizens; if there are lynching incidents happening, the government has a right to know if there is any root cause available. It is for the platform to do the cost-benefit analysis. We were engaged, as I mentioned, with at least one of the popular platforms right from the beginning, and we were always trying to find possible ways and means of getting it done. But beyond that, I will again like to mention that the platforms cannot just take shelter behind the so-called encryption. We are, in any case, not at all breaking it, or asking them to break the encryption. We are giving them the message in the clear; the responsibility of the platform remains. It is up to them to find ways and means of how it has to be done.

Nikhil Pahwa: Sir there’s an issue that’s been raised in our discussions about the criminal liability for the chief compliance officer.  Why does it need to be a criminal liability for an individual?

Rakesh Maheshwari: So, first of all, once again, let me convey that it is not the aim of these rules to have criminal liability fixed for the chief compliance officer. The chief compliance officer has to ensure compliance with the lawful orders, the lawful requests, the lawful notifications made by the government, and to resolve any systemic failures if they are happening, or any misunderstandings which crop up from time to time. This is how we are envisaging the role of the chief compliance officer.

However, at the level of the Act, it clearly says that the intermediaries are exempted, and that the intermediaries' exemptions go away in certain cases. So it is at the Act level, and not at the rules level, that the liability has become criminal or civil. So the liability has been known right from the day the Act was last amended, from the day the Shreya Singhal judgment was delivered. So the liability has remained, but the aim of the CCO is not to haul him up, but to ensure compliance with the lawful requirements of the government, and to resolve the issues. Because the majority of the time, there is a communication gap.

Nikhil Pahwa:  That’s a different thing for the company and a different thing for the employee. The issue is about criminal liability for that employee potentially, which is a very different risk.

Rakesh Maheshwari: Anyway, I have conveyed my views. 

Nikhil Pahwa: If the government is publishing FAQs like you said, can we compile questions and send them to you to consider?

Rakesh Maheshwari: I’ll be happy to help.

Nikhil Pahwa: So we are happy to compile the questions. One other question was about the reason social media intermediaries were given more time, three months, to comply. Why was no time for compliance given elsewhere?

Rakesh Maheshwari: To whom?

Nikhil Pahwa: Sir, to news entities.

Rakesh Maheshwari: To other intermediaries?

Nikhil Pahwa: Not just intermediaries; even those outside the ambit of intermediaries. To OTT streaming—

Rakesh Maheshwari: So on the MIB part, in fact, it will be MIB who will be in the best position to answer. But as regards the intermediary part, the substantial change in our intermediary rules is the storage period: in case you were storing for 90 days, you can as well store for up to 180 days. And that does not require— you are already supposed to store it for the next 90 days, so from now onwards it practically means that you have 90 more days to comply with it. By that time, you have to have a mechanism to be able to store for 90 more days: not messages, but the identity, the registration details, or whatever logs you are collecting.

Beyond that, there are certain changes of course, like acknowledgement of grievances, grievance redressal, maybe 3(2)(d), the reporting of revenge porn and such other obscene material. While the rules have come into existence immediately, we also know that, practically, the pace at which people are able to come up with their requirements is the pace at which the platforms will be able to meet those kinds of requirements, and these should also be ramped up. So it's not that all of a sudden you will have a huge demand coming. So it should be doable.

So, at any given time, it is once again the intent, because I did mention: with the variety of intermediary platforms that we have, whatever time you fix, there will always be a class of intermediary for which it turns out to be too little or too much. So there will always be issues. Courts have always been practical, courts will continue to be practical, and the government will also have to be pragmatic. So it is expected that you ramp up your capacity in certain areas as soon as possible. For significant social media intermediaries, we have already given three months, to be able to take care of certain manpower and procedural requirements, and maybe in some cases some technical requirements.

Nikhil Pahwa: Sir, a question about the chief compliance officer again: will they go to jail for not deleting something which is legitimate or lawful under rule 3(1)(d)? How will there be a check on frivolous 79(3)(b) requests?

Rakesh Maheshwari: So that’s what I mentioned, chief compliance officer’s role is to act as a bridge between the intermediary, and the needs and the lawful requests of the government to address any communication gaps, which do arise.

Nikhil Pahwa: In that case, sir, with millions of users, isn't the 15-day timeline very short? I mean, automated responses within 24 hours one can understand. But a 15-day timeline for an entity that has, let's say, 300 million users?

Rakesh Maheshwari: It doesn’t matter, 300 million users anybody if has got will also have wherewithals. So in fact it can be a problem for a smaller platform rather than a bigger platform. So, you will always— question is simple. You know, how many complaints you are likely to receive, you have received it in the past. So statistical theory, always tells you, so it’s a plain science, you need to have enough of capacity to be able to address, whether you are a small platform or a large platform, it doesn’t really matter.

Nikhil Pahwa: Sir, there’s a question that Malavika Raghavan had sent in, asking, she was very curious about how MEITY and Ministry of I&B came together to release these rules.  How did the two entities come together to release these rules?

Rakesh Maheshwari: Can’t government work together? [laughs]  Because two good people in two different ministries came together to be able to develop this new rule jointly.

Nikhil Pahwa: Sir, in the Twitter case a few months ago, there was a disagreement. What Twitter had argued was that some of the government's requests to remove content were unlawful, or that the content in question was lawful. So in this case, how are the rights of the citizen protected? If the content is lawful, and the government asks to remove it under 79(3)(b), what if the intermediary doesn't remove it? Then the CCO might go to jail?

Rakesh Maheshwari: Once again, jail… So first of all, there is a difference between a 69A order and a 79—

Nikhil Pahwa: Well sir, transparency is one big difference.

Rakesh Maheshwari: Yeah, transparency is one big difference— [actually,] no, transparency is there. I mean, let me tell you, even on 69A.

Nikhil Pahwa: There’s no transparency…

Rakesh Maheshwari: No, no, no, no. I respect your views; I don't disagree with your views at all. But for me: we always informed the concerned platform well in advance. In fact, the law's requirement is 48 hours, and we have instances where we have informed them 46 hours in advance, and the platform has come back to us.

So that’s the kind of transparency with which we work, the platforms are always part of the discussions, when the decisions are taken, the recommendations are made, except in cases of emergency, except in cases of emergency. However, in such cases of emergency, the platform in any case has to be informed of the orders, has to be part of the meeting to be convened within 48 hours of the orders.  And, it’s not that if an emergency order has been issued that the orders cannot be reversed, one, two.

There is sometimes a certain time sensitivity in certain matters, so it is time- and context-based. And we have therefore gone to the extent wherein, if we believe that something which right now is injurious to the nation, or which is likely to create a public order problem, may not be causing a public order problem after 15 days, we have always agreed, as part of the 69A committee, to review the decision after 15 days. After 15 days we once again take a call, and we have reversed our own orders.

Nikhil Pahwa: But sir, there’s no such oversight in case of 3(1)(d) or say…

Rakesh Maheshwari: No, no, no, there is no such oversight, and therefore there is no such compulsion on the intermediary to adhere either. Because I agree, I mean, that there is no such, I should say, check and balance available in the Section 79 process, the intermediary rules process. But for that reason, the intermediaries are also at liberty not to act, or to give their justification for why they are not acting, of course exposing themselves also.

Nikhil Pahwa: There’s an anonymous attendee who’s asking, what happens when orders under the IT Act under Section 69 are in conflict with the social media intermediaries obligations in other countries.  For instance, in case of a request for user data under 69A, where users from across geographies are concerned. Social media intermediaries may have to violate laws in other jurisdictions to comply with the request, and if they don’t comply you use safe harbor in India. How can platforms reconcile this in the global order?

Rakesh Maheshwari: Yeah, so first of all, let's be very clear as to what the scope of the IT Act is. The scope of the IT Act is with respect to Indians: whether you are impacted because of a computer resource acting from outside India or from within India, or whether you are impacting somebody else's computer from India. So first of all, the scope of the Act has to be very clear.

So when we are seeking information, we are seeking, or we will be provided, information with respect to an Indian user. If the user happens to be outside India, then the platform will have to follow whichever country's laws govern the data you happen to be asking for. Because you don't know where the user is; that's the beauty of the whole internet, or of anonymity on the internet. So the intermediary will therefore refuse to provide certain registration information which does not fall within the jurisdiction of the country, and in those particular cases they will ask for an MLAT (mutual legal assistance treaty).

Two, when you are asking for blocking or removal, your scope is once again limited to your country, our country. So, whether the platform decides to remove it globally, or the platform decides to remove it within India, our requirement is: you remove it within India. The rest is the platform's choice. If it is a violation of the platform's own policy, the platform will remove it globally; if it is only a violation of the Indian government's orders or Indian law, the platform will remove it within India, and we are okay with that.

Aditya Chunduru, Journalist, MediaNama: What if the government is okay with tracing being bypassable? As in, you could use fake versions of an app to bypass traceability. Then what happens when a court case finds an originator to be important? If there exists a method to bypass it, then it becomes difficult to prove anything; basically, there's too much scope for doubt. Does this really solve the problem that it intends to?

Rakesh Maheshwari: See, many times… first of all, what is our intent? What are the checks and balances that we have made in the systems, so that this intent is per se not misused, or the facility which is being provided is not misused? Beyond this, whether it actually remains useful or not: on the Internet, what you define today, the situations may change very fast. This is the intent with which we have gone. In our rules we have conveyed: this is what we will be sharing, this is what we will be expecting, this is when we will be invoking it, and if there is an alternate means, we will be invoking the alternate means. Beyond this, in fact, I'm also not really able to understand your question and probably will not be able to respond to the extent you are looking for. Our intent was this.

MediaNama’s discussion on the IT Rules, 2021 was supported by Google.

© 2008-2021 Mixed Bag Media Pvt. Ltd. Developed By PixelVJ
