Submission to the Ministry of Electronics and Information Technology, Government of India, on the Draft Non-Personal Data Governance Framework

1.     Introduction

On 12 July 2020, an expert Committee set up in 2019 by the Indian Ministry of Electronics and Information Technology (“MEITY”) released its recommendations on regulating Non-Personal Data (“NPD”). The Report by the Committee of Experts on Non-Personal Data Governance Framework proposes the creation of a separate “new national law” to govern NPD, as well as the creation of a Non-Personal Data Authority (“NPDA”) which would have both an “enabling and enforcing role”. This Submission responds to the nine-member Committee’s (“Committee”) call for feedback through a public consultation. The Submission was prepared by the Wikimedia/Yale Law School Initiative on Intermediaries and Information, an academic collaboration based at Yale Law School’s Information Society Project which aims to raise awareness of threats to an open internet, especially those affecting online intermediaries and their users, and to make creative policy suggestions that protect and promote internet-facilitated access to information.

Based on our analysis, the Committee has failed to carve out adequate safeguards against the potential abuse of citizens’ data by a range of stakeholders, posing a grave risk to human rights. Our submission concludes with a list of specific recommendations, for the Committee’s consideration.

2.     Background

The Committee’s rationale for its recommendations stems from its belief that India needs a modern non-personal data regulatory framework to ensure that the benefits accruing from the exploitation of such data are channeled to Indian communities and businesses. It defines NPD as data that falls outside the ambit of personal data as defined by the draft Personal Data Protection Bill (“PDP”), or data without any Personally Identifiable Information (“PII”). It further divides NPD into Public, Private and Community NPD, mandates separate user consent prior to the collection of data for processing as NPD, and categorizes as Sensitive any NPD derived from sensitive personal information. The Committee also identifies certain key stakeholders within the NPD framework, which include Data Principals (to whom the data pertains), Data Custodians (entities collecting the data), Data Trusts (institutions entrusted with storing and sharing data), Data Trustees (who represent the interests of the data principals) and Data Businesses (any entity deriving revenue from the collection of user data). The Committee sets out a mechanism for the mandatory sharing of certain forms of NPD by companies, including for sovereign interest purposes, core public interest purposes and economic interest purposes, and mandates that data belonging to Indian citizens and entities be governed by Indian law. Finally, it recommends the creation of a separate Non-Personal Data Authority (“NPDA”), which would review and adjudicate requests for data sharing that have been denied by data custodians or data businesses, and level the playing field for Indian tech businesses and startups.

While these recommendations are facing significant pushback from the private sector for the potential harm they pose to business interests, they also need to be scrutinized from a human rights perspective. The Committee’s recommendations raise grave concerns in the context of international human rights norms on legitimacy, necessity, proportionality and non-discrimination. Under Article 17 of the International Covenant on Civil and Political Rights (“ICCPR”), to which India is a party, governments have a duty not to engage in interference inconsistent with the right to privacy. General Comment No. 16 of the Office of the United Nations High Commissioner for Human Rights (“OHCHR”) also mandates that “effective measures have to be taken by States to ensure that information concerning a person’s private life does not reach the hands of persons who are not authorized by law to receive, process and use it, and is never used for purposes incompatible with the Covenant.” Below, we examine how the process and recommendations fall short of these baseline safeguards, and offer some suggestions for improvement.

  3. Concerns with the Consultation Process

The process through which feedback is being sought lacks consistency, transparency and accessibility. It contravenes principles of the Right to Participation in Public Affairs, which are recognized as essential to democratic rule-making by Article 25 of the ICCPR read with General Comment No. 25, and Human Rights Council resolutions 24/8 of 26 September 2013 on equal political participation, 27/24 of 3 October 2014, and 30/9 of 1 October 2015 on equal participation in political and public affairs. These principles are further explained through the OHCHR’s Guidelines for States on the effective implementation of the right to participate in public affairs (“UN Guidelines”), which recommend that citizens “should be able to access adequate, accessible and necessary information as soon as it is known, to allow them to prepare to participate effectively, in accordance with the principle of maximum disclosure”. Moreover, they state that “rights holders who are directly or likely to be affected by, or who may have an interest in, a proposed project, plan, programme, law or policy should be identified and notified. Notification should be provided to all such rights holders in a timely, adequate and effective manner...When decisions have countrywide or very widespread impact, for example during constitution-making and reform processes, everyone should be identified as potentially affected”.

The draft regulations have not been made available on MEITY’s website, where citizens would reasonably expect to find them, given that the Committee was constituted by MEITY. Instead, they are hosted on a separate citizen engagement platform, with no notification on the MEITY website directing the public to that portal or indicating that comments on the Committee’s report are being sought there. Despite MEITY’s active use of social media as a tool for disseminating news and updates, at the time of submitting this report we have been unable to find any posts from MEITY’s social media accounts informing the public that it is seeking feedback on a proposed NPD governance framework. This profound lack of public dissemination also contravenes the Indian Ministry of Law and Justice’s executive circular on the Pre-Legislative Consultation Policy (2014), which requires that any proposal which significantly impacts the rights of the public should be widely disseminated through print and electronic media, and in various languages, so that its potential implications are understood by all those likely to be affected by it.

The UN Guidelines also mandate that special consideration be given to ensuring the participation of disadvantaged communities who would be impacted by the regulatory proposal, requiring that “formal participation structures should be accessible to and inclusive of individuals and groups that are marginalized or discriminated against, including those from disadvantaged socioeconomic backgrounds”. The regulatory proposal has only been shared in English, a language not easily understood by large segments of the population. Moreover, the only way citizens can share feedback is through the online portal, which effectively excludes the large portion of the population that does not have access to the internet. There is also no procedure for feedback to be given via postal mail, a medium that could ensure broader public participation at a time when the country is dealing with the COVID-19 pandemic. The lack of effort put into actively soliciting citizen feedback on an issue that impacts core human rights is troubling and undermines the right of Indians to effective public participation. It also raises reasonable concerns that those most likely to be impacted by this regulatory framework will be the least able to provide feedback.

  4. Failure to Protect the Rights of Data Principals

Among the most important shortcomings of the Report is its failure to adequately protect the rights of data principals. While the Committee recognizes the risks posed by de-anonymization of NPD and mandates the consent of the data principal prior to its collection and use, this reliance is problematic given the widely recognized shortcomings of consent in the digital age. Those shortcomings are particularly apparent given the diversity of ways in which anonymized data may be used (and misused).

The Committee provides that data trustees and data custodians could be tasked with providing consent on behalf of their communities, and suggests that they would owe a form of “duty of care” to the data principals. However, it fails to specify what such a duty of care would look like in practical terms, leaving that to be defined by regulation. Given the challenges to consent on an individual basis, extending this principle to communities, without clearly defining how such a fuzzy concept might operate in practice, is unlikely to provide meaningful protection against abuse.

The Committee also places too much faith in the ability of current techniques to fully anonymize personal data, despite recognizing that these methods still leave individuals at risk of being re-identified. By way of example, the GDPR defines anonymized data as “data rendered anonymous in such a way that the data subject is not or no longer identifiable” (see Recital 26). This definition emphasizes that truly anonymized data must be stripped of any identifiable information, making it practically impossible to derive insights about the individual from whom it originated. Anonymized data is thus excluded from GDPR regulation altogether because it is no longer “personal data.” To achieve anonymization under the GDPR, the entity that anonymized the data must be unable to re-identify it, even when pairing it with other available information. This is an extremely difficult threshold to meet. The challenge of fully anonymizing personal data has also been noted by the EU’s Article 29 Data Protection Working Party in its 2014 Opinion on Anonymisation Techniques.
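
The re-identification risk described above can be made concrete with a minimal, hypothetical sketch of a “linkage attack”: even after direct identifiers such as names are removed, quasi-identifiers like pin code, age and gender can be joined against an auxiliary dataset (for example, a public roll) to recover identities. All names and records below are invented purely for illustration.

```python
# Hypothetical, illustrative data: direct identifiers (names) removed.
anonymized = [
    {"pin": "110001", "age": 34, "gender": "F", "diagnosis": "diabetes"},
    {"pin": "560034", "age": 29, "gender": "M", "diagnosis": "asthma"},
]

# Auxiliary data an adversary might plausibly hold.
auxiliary = [
    {"name": "A. Sharma", "pin": "110001", "age": 34, "gender": "F"},
    {"name": "R. Iyer", "pin": "560034", "age": 29, "gender": "M"},
]

def link(records, aux):
    """Re-identify records whose quasi-identifiers match exactly one person."""
    reidentified = []
    for rec in records:
        matches = [p for p in aux
                   if (p["pin"], p["age"], p["gender"]) ==
                      (rec["pin"], rec["age"], rec["gender"])]
        if len(matches) == 1:  # a unique match recovers the identity
            reidentified.append((matches[0]["name"], rec["diagnosis"]))
    return reidentified

print(link(anonymized, auxiliary))
# → [('A. Sharma', 'diabetes'), ('R. Iyer', 'asthma')]
```

Both records are uniquely re-identified despite containing no names, which is precisely why the GDPR threshold (no re-identification even when paired with other available information) is so difficult to meet.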

Pseudonymization, on the other hand, replaces personal identifiers with non-identifying references or keys, so that anyone working with the data is unable to identify the data subject without the key. However, anyone holding the key, or employing “any reasonably available means”, can at any point re-identify the individuals to whom the data refers, thereby converting NPD back into personal data. The GDPR makes clear that pseudonymized personal data remains personal data. The Committee should re-examine how its suggested framework can adequately safeguard data principals from the risks of re-identification inherent in NPD, which would essentially be pseudonymized rather than anonymized data. This becomes even more vital in light of its recommendation requiring open access to data by third parties.
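
The distinction can be illustrated with a short, hypothetical sketch: pseudonymization merely substitutes a random token for a direct identifier, and anyone holding the token-to-identity mapping (the “key”) can restore the personal data in full. The names and fields below are invented for illustration.

```python
import secrets

def pseudonymize(records):
    """Replace the 'name' field with a random token; return data plus key table."""
    key_table = {}  # token -> original identifier (the re-identification "key")
    out = []
    for rec in records:
        token = secrets.token_hex(8)
        key_table[token] = rec["name"]
        out.append({"id": token, **{k: v for k, v in rec.items() if k != "name"}})
    return out, key_table

def reidentify(pseudonymized, key_table):
    """Anyone holding the key table converts the data back into personal data."""
    return [{**rec, "name": key_table[rec["id"]]} for rec in pseudonymized]

records = [{"name": "A. Sharma", "purchase": "insulin"}]
pseudo, key = pseudonymize(records)

restored = reidentify(pseudo, key)
assert restored[0]["name"] == "A. Sharma"  # personal data fully recovered
```

This is why the GDPR continues to treat pseudonymized data as personal data: the protection is only as strong as the controls around the key.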

  5. Lack of Rules around Government Access

Another shortcoming in the level of protection contemplated by the Report is that the government is granted virtual carte blanche to collect any data “for sovereign purposes”, including national security and legal purposes. The Committee also includes in this category data demanded by a “regulator to understand and keep abreast of developments in a sector with regard to need for regulatory interventions”.

These standards are incredibly broad, and are made more problematic by the fact that requests made under the color of “sovereign interest purposes” are difficult for non-governmental and business entities to refuse, in contrast to demands made for “economic interest purposes” or “core public interest purposes”, which data custodians and data trustees may refuse if they believe the request to be excessive, or lacking integrity and good intent. In practice, this formulation places very few restrictions on the government’s ability to gather massive amounts of data about its people. While the Committee recognizes the need for checks and balances to prevent abuse, it omits any description of what such safeguards should look like, and does not even pay lip service to the existing standards of legitimacy, necessity, proportionality and non-discrimination affirmed by the Supreme Court of India in the Aadhaar judgment, or to the recommendations made by the Srikrishna Committee Report.

The Report also fails to properly distinguish between sovereign interest and public interest purposes. In the absence of clarity over these terms, the NPDA has the potential to wield vast powers. This is particularly concerning given the history of terms like “sovereign interest” and “public interest”, which have a record of being used to justify abusive surveillance and other human rights violations, making clear definitions essential.

One possible solution would be to eliminate the “sovereign purposes” category altogether, merging it into the “public interest purposes” category and clarifying the precise contours of its applicability. It would also be prudent for the Committee to keep in mind India’s legal obligations to respect its citizens’ rights to privacy and protection from unlawful or arbitrary surveillance under Article 12 of the Universal Declaration of Human Rights (“UDHR”) and Article 17 of the ICCPR.

There are also broader concerns around proportionality. No limits are prescribed on the categories of personal data that can and cannot be used to generate NPD or on the ways in which the NPD can be used by various stakeholders, thus treating all data (including sensitive data) and all business activities as fair game in the pursuit of “economic growth”.

  6. Conceptual Challenges with the Ownership Model for NPD

There is an inherent tension between the government’s proposed role as a guardian of the rights of communities (as a Community NPD trustee or as an “enforcer” through the NPDA) and its role in seeking to maximize economic benefits accruing from their data (as an “enabler” through the NPDA). While the Committee couches the government’s “enabler” role within a broader notion of ensuring “beneficial ownership/interest” for the community, it does not elaborate exactly how this benefit will accrue to the community.

The problem with this approach is rooted in the Report’s focus on the principle of data sovereignty as set out in Section 5.1(i). There are two aspects to this. The first is that NPD derived from Indian citizens and Indian entities, as well as data collected in India, should have a defined ownership. The second is that Indian law should apply to data collected in or from India, or by Indian entities. While the second principle reflects traditional understandings of national sovereignty and India’s ability to establish its own norms, the first is problematic insofar as it sets the stage for a regulatory framework modelled on notions of private property. The limitations of this approach are illustrated by Section 5.1(ii), where ownership of intangible assets has to be reformulated to mean “a set of primary economic and other statutory rights”, and the notion of “beneficial ownership/interest” is introduced.

Instead, a better model would understand NPD as a public good, allowing its private ownership only under exceptional circumstances. This framework recognizes that NPD can be used simultaneously by more than one person without reducing its availability. It would also allow the economic and social benefits derived from the use of NPD to be better distributed among the Indian population, rather than concentrated in the private sector. Under this model, non-profit agencies would have more opportunities to use data for the benefit of society, while Indian businesses would still be able to obtain economic value. More importantly, framing NPD as a public good reshapes the framework to prioritize the public interest in its use, leading to a system that is more respectful and protective of the rights of the communities and individuals who will, directly or indirectly, be impacted by the processing of NPD.

  7. Improving Community Benefits

Based on the Report, it is not clear how the suggested data-sharing framework benefits communities, especially those that are marginalized and economically disadvantaged. Private actors are required to share anonymized metadata openly with the public, and to provide raw or factual community NPD at no cost to all those who request it. If they moderately process the community NPD, they may still be mandated to share it for FRAND (fair, reasonable and non-discriminatory) remuneration. With any subsequent “value addition”, they may be required to bring the data to a “well-regulated” data market, where the price would be determined by market forces. The rationale is that because the data is generated by Indians, it should be made available, on a priority basis, to other Indians seeking to extract value from it. Although cloaked as being “in the larger public interest”, it is not clear how the communities to whom the data belongs would benefit from this mandatory handover.

The Committee also fails to demonstrate how it will engage with various communities, secure their informed buy-in, protect their ownership of their proprietary data, and prevent exploitation by third parties. Given that the Committee justifies this vast exercise in data collection in the interests of economic growth, it is important to ask to whom this growth accrues. The current recommendations appear to privilege the interests of the business community, an intent further confirmed by the NPDA’s mandate of “unlocking value in Non-Personal Data”. This underscores an inherent discrimination in the application of this regulatory framework. It also raises concerns of legitimacy. While economic growth is a legitimate aim, it should not be pursued at the expense of the rights of vulnerable communities.

The proposal seems mainly targeted at benefiting the Indian business community, by giving it State-mandated access to community data. In fact, by making such data accessible to purely profit-driven private players, it actually harms the agency and ownership that communities have over their data, which can embody the proprietary knowledge, practices and traditions these communities have acquired over centuries and which serve as a source of revenue for them. For the “principle of duty of care” to have any meaning, it is important that communities themselves get to decide to whom their data is made accessible and how it is used. The standard-setting process should be grassroots-led, to ensure that this agency is meaningful.

  8. Ironing out the Details for Data Trusts

While data trusts are an important aspect of the data governance conversation, their effectiveness is still being studied. At their core, and similar to a legal trust, they seek to create a fiduciary relationship between data principals and data trustees, with the latter looking after and making decisions about data in much the same way that trusts have been used to look after and make decisions about other forms of assets in the past. Legally speaking, a fiduciary duty is considered the highest level of obligation that one party can owe to another. In the context of data trusts, privacy law experts note that a fiduciary duty involves stewarding data with a high degree of “impartiality, prudence, transparency and undivided loyalty.”

While data trusts can be useful in some scenarios, they can also be harmful in others. Regulatory authorities around the world are still debating their efficacy, and whether they can be effectively governed at the scale anticipated by the Committee. The Committee has not taken a position on resolving these challenges. It has also not provided a workable legal framework to ensure that data trusts will safeguard community data from exploitation by those with purely profit-driven motives. Nor has it explained how data trusts will make decisions around data collection, sharing and use, or how they will ensure accountability, democracy and transparency in decision-making. The Committee has also failed to develop rules to safeguard the independence of data trustees, or to assess conflicts of interest. Finally, the Committee has not examined how data principals can challenge decisions around the use of their data if they are adversely impacted. By failing to critically examine data trusts from a rights-based perspective, the Committee’s recommendation to adopt them wholesale, at a national scale, seems hasty and under-developed. The move is also premature in the absence of a rights-respecting law on personal data protection.

  9. Data Localization

While there are legitimate debates around the challenges associated with cross-border data flows, any localization framework which purports to safeguard the data of Indian citizens must, at a minimum, be developed in the context of a robust security infrastructure. India is simply not there yet: its security infrastructure remains a work in progress, as demonstrated by the fact that it is the third most affected country in the world when it comes to cyberattacks and data breaches. Before any localization mandate is seriously considered, the government will need to do more to support robust local network security domestically.

  10. Vague Definitions of “Community”

The Committee’s definition of “community” is excessively broad and ambiguous, creating difficulties in determining exactly what commonalities would constitute a community under the proposed law, and what such a community’s collective rights and responsibilities would be towards the data generated by it. Would people of disparate backgrounds living in the same pin code constitute a community? Or would these delineations be based on profession, economic class or digital footprint? Would citizens have a say as to which communities they are classified into or would the state decide that? How would the Government define ownership and agency within the community over its data? How would it ensure non-discrimination in the governance of these communities? In what instances would it be fair for a government agency to act as a data trustee for a community, without invoking concerns of bias given the State’s sovereign authority? Given the vagueness of the concept, it is difficult to determine an appropriate framework to govern community data. These questions need to be clarified before the Committee’s work on this issue moves forward.

  11. Justifying the New NPDA

The Committee has also failed to adequately make the case for a separate NPDA, given that the proposed Data Protection Authority and the Competition Commission of India already possess sufficient authority and jurisdiction to determine issues of data sharing and data monopolies. The NPDA appears to replicate some roles from both of these entities, but with some inherent conflicts in its mandate, particularly with regard to the tension between extracting maximum value from data and fairly adjudicating disputes regarding appropriate levels of access. There are also general questions as to how the NPDA will balance public interests against narrower economic interests.

There are also challenges in defining the limits of the NPDA’s power, as well as the potential appeals process against its decisions. All of these need to be clarified by the Committee.


  12. Recommendations

  1. The call for feedback should be extended by a minimum of three months, and should be extensively publicized, including on social media, local, print, television and radio news channels, in various languages, and in collaboration with district-level administration.
  2. The government should organize virtual consultations, broadcast live through official social media and other digital platforms, to engage in open discussion with academia, civil society, the private sector and any other interested parties, and to solicit their feedback. Feedback should also be accepted through postal mail.
  3. The Report should be hosted on the MEITY website, with translations available in different Indian languages.
  4. The Committee should clearly define the key elements of the “duty of care” owed to data principals by those entrusted with the authority to make decisions on their behalf, namely data trustees and data custodians.
  5. The Committee should clearly define the remedies available to a data principal in case of a breach of the “duty of care”.
  6. The Committee should carve out a stronger accountability framework for data businesses engaged in the conversion process, and provide data principals with stronger protections, to prevent harms from the re-identification of their data. 
  7. The Committee should consider eliminating the “sovereign interest purpose” category altogether, merging it into the “public interest purposes” category. It should also restrict the scope of government requests for “sovereign purposes”, allowing such requests only in exceptional situations, for time-bound purposes and subject to independent oversight, in line with India’s obligations under Article 12 of the UDHR and Article 17 of the ICCPR.
  8. The definition of non-personal data should be reformulated as a “public good”, to ensure better distribution of its socio-economic benefits among Indian communities, nonprofits and businesses, instead of being concentrated in the private sector.
  9. The regulatory process should be more inclusive, allowing communities a more direct say in deciding who gets access to their data and how it should be used.
  10. The Committee should provide stronger safeguards to protect communities’ traditional knowledge from exploitation by private players. 
  11. The Committee should carefully assess the suitability of data trusts as a solution to the problem of effective data stewardship within the Indian context. In situations where data trusts may not be fit for purpose, the Committee should explore alternative models that provide a higher degree of transparency and accountability.
  12. India’s data security infrastructure needs to be strengthened as a prerequisite to serious discussions around data localization, in order to avoid putting the nation at risk.
  13. The Committee should clearly define what constitutes a community, and how these determinations are to be made and applied.
  14. The Committee should clarify the role, powers, and procedures of the NPDA, including with regard to how this relates to existing institutions which operate in the same space.