Google’s chief privacy officer Keith Enright on Monday posted Google’s guidelines (pdf) for regulators enacting data protection rules. The post came a week before India’s IT ministry ends its public consultation on the Srikrishna committee’s data protection bill. “This framework helps Google evaluate legal proposals and advocate for smart, interoperable, and adaptable data protection regulations,” the document says. The document is similar to Access Now’s lengthier do’s and don’ts brief (pdf) for lawmakers.
Here’s how Google’s recommendations match up to the Srikrishna committee’s bill.
Data collection and transparency
Google’s principles
Collect and use personal information responsibly.
Organizations must operate with respect for individuals’ interests when they process personal information. They must also take responsibility for using data in a way that provides value to individuals and society and minimizes the risk of harm based on the use of personal information (i.e., data that can be linked to a person or personal device).
Mandate transparency and help individuals be informed.
Organizations must be transparent about the types of personal information they collect, why they collect it, and how they use or disclose it, particularly when used to make decisions about the individual. Regulators should encourage organizations to actively inform individuals about data use in the context of the services themselves, helping to make the information relevant and actionable for individuals.
The Data Protection Bill’s provisions
The Data Protection bill requires strict standards of consent for data collection.
On transparency, however, the bill gives data subjects limited rights. Under Section 24(1), users cannot obtain a copy of the data stored about them; they are only entitled to a summary of what data is stored and how it is being processed.
Purpose limitation and data correction
Google’s principles
Place reasonable limitations on the manner and means of collecting, using, and disclosing personal information.
Collection and use of personal information can create beneficial and innovative services, within a framework of appropriate limits to the collection, use, and disclosure of personal information to ensure processing occurs in a manner compatible with individuals’ interests and social benefits.
Maintain the quality of personal information.
Organizations should make reasonable efforts to keep personal information accurate, complete, and up-to-date to the extent relevant for the purposes for which it is maintained. Data access and correction tools, as mentioned below, can assist organizations in meeting this obligation.
Define personal information flexibly to ensure the proper incentives and handling.
The scope of legislation should be broad enough to cover all information used to identify a specific user or personal device over time and data connected to those identifiers, while encouraging the use of less-identifying and less risky data where suitable. The law should clarify whether and how each provision should apply, including whether it applies to aggregated information, de-identified information, pseudonymous information or identified information.
Give individuals the ability to access, correct, delete and download personal information about them.
Individuals must have access to personal information they have provided to an organization, and where practical, have that information corrected, deleted, and made available for export in a machine-readable format. This not only empowers individuals, it also keeps the market innovative, competitive, and open to new entrants.
The Data Protection Bill’s provisions
Under Section 5(2), “Personal data shall be processed only for purposes specified or for any other incidental purpose that the data principal would reasonably expect the personal data to be used for, having regard to the specified purposes, and the context and circumstances in which the personal data was collected.” (emphasis added)
Under Section 25, data subjects have the right to correct their data; if a data controller objects to a correction request, it must inform the subject of its reasons in writing. Personal data is defined very broadly in the bill, which runs counter to Google’s principle: it covers any “data about or relating to a natural person who is directly or indirectly identifiable, having regard to any characteristic, trait, attribute or any other feature of the identity of such natural person, or any combination of such features, or any combination of such features with any other information”.
Right to object to processing and data security
Google’s principles
Make it practical for individuals to control the use of personal information.
Organizations must provide appropriate mechanisms for individual control, including the opportunity to object to data processing where feasible in the context of the service. This does not require a specific consent or toggle for every use of data; in many cases, the processing of personal information is necessary to simply operate a service. Similarly, requiring individuals to control every aspect of data processing can create a complex experience that diverts attention from the most important controls without corresponding benefits.
Include requirements to secure personal information.
Organizations must implement reasonable precautions to protect personal information from loss, misuse, unauthorized access, disclosure, modification, and destruction, and should expeditiously notify individuals of security breaches that create significant risk of harm. Baseline precautions should apply to any collection of personal information, and additional measures should account for and be proportionate to the risk of harm.
The Data Protection Bill’s provisions
The Data Protection Bill does not have a right to object to processing.
The bill enforces data security requirements through mandatory audits by data auditors and through penalties on those who do not comply with the standards set out by the Data Protection Authority of India.
Compliance and harm
Google’s principles
Hold organizations accountable for compliance.
Accountability can and should come in many forms. Lawmakers and regulators should set baseline requirements and enable flexibility in how to meet those requirements. Industry accountability programs and safe harbors can incentivize best practices, particularly in providing more flexible approaches to dealing with evolving technologies.
Focus on risk of harm to individuals and communities.
Regulators should encourage the design of products to avoid harm to individuals and communities. Enforcement and remedies should be proportional to the potential harms involved in the violation. Innovative uses of data shouldn’t be presumptively unlawful just because they are unprecedented, but organizations must account for and mitigate potential harms. This includes taking particular care with sensitive information that can pose a significant risk. To enable organizations to develop effective mitigations, regulators should be clear about what constitutes a harm.
Apply the rules to all organizations that process personal information.
Data is increasingly important across all sectors of the modern economy. Aside from the context of particular relationships that have existing rules, like with one’s employer or attorney, legislation should apply to all economic sectors and all types of organizations that process personal information. While certain sectors (e.g., healthcare) may have additional rules, regulation should set a baseline for all organizations. The application of the law should also take into account the resource constraints of different organizations, encouraging new entrants and diverse and innovative approaches to compliance.
The Data Protection Bill’s provisions
Here, Google is asking for a flexibility that the draft bill does not offer. There are stiff penalties for data leaks, decided by the Data Protection Authority of India (DPAI), and even criminal penalties may apply.
The Adjudicating Officer, who represents the DPAI, will determine whether harm was caused or whether a data controller violated the bill’s provisions. The two do not always coincide: a data breach may not meet the legal standard of harm, yet it will most likely still run afoul of the draft law. This leaves the flexibility in deciding penalties with the DPAI rather than with the data controller. The bill also exempts small entities, defined as those that do not process more than one hundred users’ data each day, from some data protection requirements.
Enforcement and localisation
Google’s principles
Distinguish direct consumer services from enterprise services.
Much processing of personal information is done by one company on behalf of another, where the processor lacks legal authority to make independent decisions about how to use the data or to operate outside the bounds of the client’s direction. Sometimes this distinction is described as “processors” versus “controllers”. Recognizing it allows for the efficient use of vetted, qualified vendors with minimal additional compliance costs, which is particularly important for smaller entities. Processors can look to the controller to meet certain obligations under the law, including transparency, control, and access, but processors must still meet basic programmatic and security responsibilities.
Design regulations to improve the ecosystem and accommodate changes in technology and norms.
The technology involved in data processing is not static, and neither are the social norms about what is private and how data should be protected. A baseline law can provide clarity, while ongoing reviews (e.g., rulemakings, codes of conduct, administrative hearings) can provide more flexible and detailed guidance that can be updated without wholesale restructuring of the legal framework. Governments can support these goals by rewarding research, best practices, and open-source frameworks. Creating incentives for organizations to advance the state of the art in privacy protection promotes responsible data collection and use.
Apply geographic scope that accords with international norms.
Data protection law should hew to established principles of territoriality, regulating businesses to the extent they are actively doing business within the jurisdiction. Extra-territorial application unnecessarily hampers the growth of new businesses and creates conflicts of law between jurisdictions. In particular, small businesses shouldn’t have to worry about running afoul of foreign regulators merely because a few people from another country navigate to their website or use their service.
Encourage global interoperability.
Mechanisms allowing for cross-border data flows are critical to the modern economy. Organizations benefit from consistent compliance programs based on widely shared principles of data protection. Countries should adopt an integrated framework of privacy regulations, avoiding overlapping or inconsistent rules whenever possible. Regulators should avoid conflicting and unpredictable requirements, which lead to inefficiency and balkanization of services and create confusion in consumer expectations. In particular, geographic restrictions on data storage undermine security, service reliability, and business efficiency. Privacy regulation should support cross-border data transfer mechanisms, industry standards, and other cross-organization cooperation mechanisms that ensure protections follow the data, not national boundaries.
The Data Protection Bill’s provisions
The data protection bill does not draw a line between enterprise and consumer services, so penalties apply equally to both, especially since the definition of personal data remains broad. The bill also does not appear to address Google’s principle of designing regulations that ‘improve the ecosystem and accommodate changes in technology and norms’; that is something the government might attempt proactively, though its record on rewarding research and open-source frameworks is mixed.
On encouraging global interoperability, Google’s approach and the Indian government’s diverge most significantly. Both the draft bill and the leaked e-commerce policy draft recommend varying degrees of localisation of personal data; at a minimum, the Data Protection Bill requires at least one copy of personal data to be stored in India. Google CEO Sundar Pichai has personally written to the IT Ministry urging it to allow cross-border data flows.
