Critiquing the PDP Bill: Data Protection as a Market Measure for Anti-Trust Regulation

- Dr. Atul Jaybhaye and Sarthak Wadhwa

Over the past few months, the Competition Commission of India (CCI) delivered an order against Google, fining it Rs. 936.44 crore for abusing its dominant position in relation to its Play Store policies.[1] The National Company Law Appellate Tribunal (NCLAT) then denied Google’s appeal for an interim stay in this regard, preferring to take up final hearings on the matter in April.[2] This case has served as a turning point, demonstrating India’s strong stance on regulating the market dominance of Big Tech, especially when it comes to the multifarious services that these platforms peddle. However, while the B2B market has seen leaps in regulation, one major obstacle reins in the anti-trust machinery in the consumer-facing, platform-based tech industry: data. The lack of a robust data protection regime can impair competition in markets where data-banks run the show, disincentivizing new entrants and limiting constructive regulatory intervention.[3] As India slowly develops her legislative proposal for personal data protection, it is imperative to appreciate how the tech industry and the anti-trust regulatory environment will receive the Personal Data Protection Bill, 2022 (“the Bill,” hereinafter). This piece will conduct a stakeholder analysis – qua the interests of consumers, Big Tech players, and third-party competitors – from these perspectives to appreciate how the Bill can further or foil digital competition.

To this effect, the following sections will – first, identify how digital competition regulation can be furthered, and how data protection (or the lack thereof) can hinder the same; second, examine how the provisions of the Bill act as bottlenecks in this endeavour; and, third, broadly propose privacy-preserving pro-competition policy solutions to these issues.

0. Decoupling Data and Dominance

“Data is the new oil.” The digital economy has benefitted greatly from the immense and largely untapped value of data; the extraction of this value has not only revolutionized how we engage with the world, but has also fuelled the rise of platforms that mediate this engagement.[4] American tech giants (Amazon, Apple, Google/Alphabet, Facebook/Meta, Microsoft, etc.) have built formidable empires out of the network effects of data, allowing them to leverage their platforms into trillion-dollar fortunes. By bundling their multifarious services with each other, and using data as a through-line to transpose consumer bases,[5] these companies have monopolized entire service verticals, while resisting any disruptive competition.[6] Consequently, consumers are left with no suitable alternatives when these platforms expose them to predatory advertising,[7] data surveillance,[8] profiling,[9] and breaches of privacy. Even if disruptive competitors do arrive on the scene, data privacy is less of a guarantee and more of a pipe-dream.[10]

That these platforms continue to remain mainstays of the digital economy suggests one thing: they do not regard data protection as a concern in their market operations. Big Tech certainly clashes with regulators over these issues, but their services rarely advertise themselves as more secure than others to stand out.[11] Breaches of privacy should not be treated as inevitable technological failures, but as punishable market behaviour to elicit more responsible action from platforms. To this effect, this piece advances three critical measures through which to address these market failures: first, user-facing notice and consent requirements; second, platform-facing data localization requirements; and, third, market-facing interoperability mandates such as data portability.

1. Consumer Interests: Notice & Consent

a. The Evolution of Consent Management

Consent is foundational to any privacy-preserving data protection law: the consent to provide data, to dictate its use, to authorize further sharing, and to generate derivative meta-data from it is only meaningful when given on a particularized, limited, and time-bound basis. The ‘notice and consent’ model was first propounded in the OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data, adopted on 23 September 1980. It recommended the adoption of certain minimum standards of protection of privacy with regard to personal data such as: collection limitation, use limitation, purpose specification, data security and controller accountability.[12] However, the Fourth Industrial Revolution and the advent of the information age have rendered these broad stipulations moot. Couched in hyper-technical verbiage and masquerading as click-wrap agreements, notices seeking user-consent weaponize user apathy to obtain unbounded, manufactured consent to collect, process and share user-data.[13]

In contrast with this trend towards swindling broad consent from users, India has developed a novel framework to facilitate market access to sensitive financial data in a privacy-preserving, consent-based manner. The Data Empowerment and Protection Architecture (DEPA) envisages the creation of a third-party consent manager to mediate the flow of user information between the information providers who collect it (banks, insurance providers, tax platforms, etc.) and the information users authorised by the data principal (credit agencies, financial service providers, wealth managers, etc.).[14] The consent instruments conceived of by the DEPA embody privacy-by-design: information is shared on a particularized, limited and time-bound basis upon informed and revocable consent by the user.[15] The consent-management infrastructure proposed by the DEPA has already been put to use in the form of the Account Aggregator (AA) framework, to mobilize financial information to generate value for consumers.[16] Financial services have been made more vibrant, accessible and efficient due to the data-flows between users and providers of financial information, enabled by the AA-ecosystem which operates entirely on the basis of consumer-consent.[17]
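The privacy-by-design properties described above – particularized, limited, time-bound, and revocable consent – can be illustrated with a minimal sketch of a consent artefact. The field names and methods below are hypothetical illustrations of the principle, not the actual DEPA/AA technical schema:

```python
from dataclasses import dataclass
from datetime import datetime


@dataclass
class ConsentArtefact:
    """Illustrative consent artefact embodying privacy-by-design.

    Hypothetical structure: particularized (one purpose), limited
    (enumerated data fields), time-bound (expiry), and revocable.
    """
    data_principal: str       # the user whose data is shared
    information_user: str     # e.g. a credit agency authorised by the principal
    purpose: str              # the particularized purpose of processing
    data_fields: tuple        # the limited set of data points covered
    expires_at: datetime      # time-bound validity of the consent
    revoked: bool = False

    def revoke(self) -> None:
        # consent remains revocable by the data principal at any time
        self.revoked = True

    def permits(self, requested_field: str, at: datetime) -> bool:
        # a request is honoured only while consent is live, unexpired,
        # and covers the specific field being asked for
        return (not self.revoked
                and at < self.expires_at
                and requested_field in self.data_fields)


# usage: a consent manager would evaluate each data request against the artefact
consent = ConsentArtefact(
    data_principal="alice",
    information_user="credit-agency",
    purpose="loan underwriting",
    data_fields=("bank_balance",),
    expires_at=datetime(2025, 1, 1),
)
```

On this model, a request for a field outside `data_fields`, after `expires_at`, or after revocation simply fails – which is the structural difference from the omnibus click-wrap consent criticized earlier.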

The third-party consent-management node could help confer greater agency on the users of digital platforms insofar as their consent becomes the cornerstone for any data processing and sharing. Concurrently, the DEPA’s consent management model is pro-competitive as well insofar as it facilitates the deployment of new interoperable service-offerings that can utilize existing financial data. However, three policy limitations emerge in adopting this model more widely, especially for social media intermediaries and other Big Tech operations. First, consent management requires sophisticated regulation to ensure that the managers remain accountable to all stakeholders – the consumers/data providers, the firms/data users, and the regulator.

Second, the infrastructure can only mediate the transfer of data, and not the internal processing of data that has already been collected. Incumbent market-actors can essentially prey on the ideas that any potential competitor may develop using their data, by limiting the data points that such a competitor may have access to, while diversifying their own service-offerings with unlimited access to their own data banks. Third, the use of the aforementioned consent management model in the financial sector is founded on other security measures and regulations already in place with respect to financial data[18] – the model cannot be transposed to, say, data collected by social media companies unencumbered by similar restrictions. In effect, the re-imagination of notice and consent requirements requires wider policy consideration of how consent is conceived of by the data protection law, its potentially retroactive effect on existing data, and the affixation of accountability upon circumvention thereof.

b. Deeming Consent: An Oxymoron?

The Bill defines consent of the data principal as “any freely given, specific, informed and unambiguous indication of the Data Principal's wishes by which the Data Principal, by a clear affirmative action, signifies agreement to the processing of her personal data for the specified purpose.”[19] Data fiduciaries may issue notices specifying the purpose for which they seek the collection of data, on the basis of which informed consent can be obtained – either directly through the data principal or a consent manager engaged thereby.[20] However, insofar as the processing of personal data can be reasonably expected, and such data is voluntarily shared – the data principal is also deemed to have consented to such processing of her personal data.[21] For instance, for the provision of State services and benefits, compliance with the law/any judgment, in case of a medical emergency, in the instance of a threat to public health, and to provide assistance in the event of a disaster or breakdown of public order – consent is deemed to have been given.[22]

However, presently, the Bill also deems consent to have been provided in employment-related matters: whereas safeguarding corporate interests and intellectual property against espionage and infringement is a legitimate legal objective, extending deemed consent to matters of performance assessments, recruitment and termination has the potential of warping into corporate surveillance.[23] The negative psycho-social consequences of such surveillance are well-documented,[24] and allowing for employees’ right to privacy to be subordinated to corporate interests does not bode well for the Bill. The drafters of the Bill do not appear cognizant of the manifest inequality of an employee’s bargaining position vis-à-vis the employer,[25] which is exacerbated by allowing the employer to assume deemed consent for whatever performance assessment metric is put in place. In effect, the fundamental right to privacy guaranteed by the Puttaswamy judgments[26] appears to have been diluted for the working class in a private enterprise. Insofar as these employee-interests are differentiable from Big Tech’s corporate and competitive interests, they have been flagged alongside consumer interests in this stakeholder analysis.

Further, in creating a wide public interest exception that encompasses privately ordered activities such as credit scoring and debt recovery,[27] the Bill not only undermines the elaborate consent-managed Account Aggregator framework established by the Reserve Bank for these purposes, but also exposes data principals to potentially predatory business practices without their explicit consent to participate therein. For instance, loan sharks and credit card salesmen can feasibly access this data upon the deemed consent of the data principal to conduct their businesses insofar as it is ancillary to the aforementioned “public interest” purposes.

Even apart from this, the “any fair and reasonable purpose” exception contained in Section 8(9) of the Bill raises more pressing concerns. Data fiduciaries are allowed to process data for any purpose/public interest that they believe outweighs the adverse effect on the data principal;[28] no notice requirement is contemplated for such processing since consent is deemed on account of the principal’s “reasonable expectations” of such processing, and any relief may only be available ex post facto, when the harms of such processing may have already materialized. An omnibus exception to express consent thus has the potential to circumvent the notice requirements envisaged by the Bill, in cases where any opinion/preference is voluntarily shared on the data fiduciary’s platform. Insofar as they are imagined to be data fiduciaries as well, and no zero-knowledge proof mechanism is put in place against them, these concerns hold true even for consent managers engaged by data principals, who may have access to different kinds of data about their clients.[29]

Abusive, privacy-circumventing practices may continue to fester behind the backdoor of “deemed consent,” meriting future reconsideration of the breadth of the provision. More broadly, it is relevant to note how the tacit acceptance of these circumventive data collection/processing practices through “deemed consent” further entrenches the dominance that Big Tech players enjoy – by allowing them to, first, obtain omnibus data collection consent in exchange for their multifarious services; second, overlay such data with existing data banks across service verticals to – compared to other competitors – more accurately produce user-profiles for more effective data processing and use; and, third, control access to such internal user-profiles and data in abusive exercise of such market dominance.[30]

2. Corporate Interests: Data Localization

a. Vocal for Local

The OECD Guidelines envisaged a world where the flow of information was unimpeded by national borders, recommending global minimum standards for data protection to insure against compromised security.[31] However, in the absence of regional specifications for data protection, Big Tech has profiteered off the low compliance costs represented by these antiquated minimum standards. Big Tech was only spurred to change in response to the European Union’s (EU) General Data Protection Regulation (GDPR), which completely reinvented privacy and data protection for the region.[32] Faced with a new regulatory regime and techno-legal grammar, Big Tech soon reformed its global policies to comply with the GDPR as the new minimum standard[33] (barring some notably frustrating exceptions – such as limiting the remit of these new privacy-preserving policies to the EU, and creating a weaker parallel policy for countries like India).[34] Nevertheless, as more nations move towards enforcing municipal data protection laws, this global approach is facing greater tension as regional compliance costs rise.

One of the ways in which data protection regulation is being localized is literally through data localization mandates that – first, impose geographic restrictions on the export of data to devices outside the borders of a nation/region; second, spur the development of data storage and processing infrastructure within these regional limits; third, limit the replication of data outside this local infrastructure; and, fourth, adopt particularized standards for the treatment of such data even outside these limits. The continuum that emerges as a result of combining different types of restrictions can help model jurisdiction-specific regulatory positions, with their own market peculiarities and compliance requirements.
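The continuum of regulatory positions described above can be sketched as combinations of the four restrictions. The taxonomy below is an illustrative model of the article's framing, not a codified legal classification, and the named "positions" are hypothetical examples:

```python
from enum import Flag, auto


class LocalizationRule(Flag):
    """Hypothetical taxonomy mirroring the four restrictions above."""
    EXPORT_RESTRICTED  = auto()  # geographic limits on exporting data abroad
    LOCAL_INFRA        = auto()  # storage/processing infrastructure must be in-country
    NO_FOREIGN_COPY    = auto()  # replication outside local infrastructure barred
    EXTENDED_STANDARDS = auto()  # local standards follow the data even abroad


# Illustrative jurisdiction-specific positions on the continuum:
# a "soft mirroring" regime keeps a local copy but allows export ...
SOFT_MIRRORING = LocalizationRule.LOCAL_INFRA

# ... while a "hard localization" regime combines several restrictions
HARD_LOCALIZATION = (LocalizationRule.EXPORT_RESTRICTED
                     | LocalizationRule.LOCAL_INFRA
                     | LocalizationRule.NO_FOREIGN_COPY)
```

Modelling the restrictions as composable flags reflects the piece's point that jurisdictions do not face a binary choice, but pick a position along a continuum with distinct compliance costs.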

Data localization has previously bothered Big Tech through a ruling of the Court of Justice of the EU (CJEU) – Data Protection Commissioner v Facebook Ireland Ltd and Maximillian Schrems[35] (“Schrems II” hereinafter) – whereby the sharing of European data with the United States under the trans-Atlantic Privacy Shield was invalidated on account of American surveillance.[36] By imposing individual risk assessment requirements on every instance of data transferred to a non-EU nation, Schrems II exponentially raised compliance costs for Big Tech. As a natural consequence of high compliance costs, oligopolistic barriers to market entry posed by globally recognized firms are alleviated in regional settings where local competitors find themselves on an equal footing. In other words, a direct correlation has been observed between stronger data localization laws and greater local competition in the regional digital economy.[37]

b. Data Diffusion: Lack of Localization

Unlike other provisions of the Bill which have been significantly elaborated upon, the data localization provision has little to offer by way of content; the Bill does not settle on any combination of the considerations flagged above, and cannot be located on the continuum of data flows. The Bill merely acknowledges that the Government may notify the countries to which data can be exported for storage and processing,[38] without expanding on the terms and conditions that may be put in place prior to such transfer. Therefore, it is unclear what kind of data may be exported to other countries, how long such data can be retained abroad, whether such data can be processed outside, or if such data can subsequently be sold/transferred to countries that the Government may not have initially notified. This lack of clarity about how cross-border data transfers may occur is concerning.

Apart from this, Section 18 of the Bill summarily subordinates the entire data localization conversation to the interests of sovereignty, public order and enforcement of the law – suggesting that strong data localization may be an exception to hitherto nebulous general practice. In other words, by flagging these extraordinary securitization interests, the Government has only denoted the outside boundaries of when data transfers may not be permitted. Presently, there is no framework for the internal regulation of how such transfers may take place (as flagged above) and data localization appears to be a security-driven exception more than a general rule. Without clarity on the specifics of the data localization framework, it may not be possible to assess the anti-trust implications thereof.

3. Market/Competitive Interests: Interoperability

a. The Privacy Paradox

Advocates for digital competition and innovation have long regarded the development of interoperable platforms as the way forward to address the market concentration enjoyed by Big Tech. Interoperability refers to a broad principle of mutual use whereby different software, platforms, devices and other faculties are designed in a manner that their components, protocols and information are usable/readable across the board. For instance, the adoption of a standard USB Type-C charger for mobile phones allows chargers manufactured by different phone makers (and other third-party aftermarket enterprises) to work for all mobile phones.[39] With the backdrop of the successful implementation of interoperability in internet communication (IP, HTTP, etc.), readable file formats (.mp3, .doc, etc.), and hardware compatibility (chargers, mice, healthcare devices, etc.), it is imagined that access to Big Tech data can be secured for third-party service providers by making the data-banks more interoperable. Scraping and aggregating of information has helped create valuable products in the form of search engines, booking portals, ad-blockers, news aggregators, and other application programming interfaces (APIs).[40]

However, at the same time, allowing free rein over the information collected by Big Tech can have disastrous privacy implications as well.[41] Interoperability, therefore, comes at the cost of privacy – where the creation of derivative third-party services depends entirely on how much trust can be reposed in such third parties.[42] Unsurprisingly, Big Tech has fanned the fears of third-party data manipulation to demand stronger privacy laws which restrict data access to third parties,[43] without putting a check on its own internal, clandestine use.[44]

b. Mechanizing Interoperability through Data Portability

The conversation around interoperability is always supplemented with a discussion about data portability, and the autonomy it confers on data principals to determine the storage, use and transportation of their data. By allowing data principals to see all their data as collected and processed by a data fiduciary, and enabling them to retain, remove or transfer this data to another platform if they so choose[45] – data portability can help directly dismantle the network effects enjoyed by Big Tech by supplying their competitors with consolidated user profiles.[46] India has previously recognized the value of data portability and the competitive advantage that data confers on incumbent platforms; however, India has always been hesitant to recognize an actionable right to data portability, resorting to the ex-ante resolution of other existing concerns before further expanding the scope of data protection law.[47]

Conceivably, data portability can pose significant privacy and security risks insofar as it requires ease of access to hitherto protected data. Existing research has shown how the positive right to data portability as contained in the GDPR results in laxity in verification on the part of data fiduciaries who create these access channels under the law; concurrently, the more restrictive provisions contained in the California Consumer Privacy Act (CCPA) – despite favouring incumbent platforms – offered more security to data.[48] Further, data points concerning different data principals may be entangled together – whereby the data bank for one data principal may contain data shared by another, non-consenting data principal (say, the contact information of Facebook friends).[49] The proliferation of consent requests for such data portability may not only detract from the user experience on these platforms but may also expose users to third-party risks that they may not have wanted to assume in the first place.

Nevertheless, a reconceptualization of the right to data portability as vesting with the data principal, as opposed to the third-party service provider, can help resolve the privacy paradox encountered above. The solution to third-party regulation does not lie in legislative design but in a wider regulatory environment where best practices and enforceable norms are developed upon multi-stakeholder consultation. Third parties compliant with industry best practices and data protection laws may be allowed access to data banks, so long as the data principal they serve has the right to that data (even if data related to another principal may be entangled therein). Further, auditing the third parties’ systems to ensure that such associated data is not used or processed outside these specific use-cases can further alleviate the privacy concerns. To top it off, putting in place a strong opt-out mechanism to redact data outside a specifically authorized platform can allow data principals to remain in ultimate control of their data.
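The three safeguards above – the right vesting with the data principal, audited third parties, and a standing opt-out – compose into a simple gatekeeping check on any portability request. The sketch below is a hypothetical illustration of that layered logic; the function and parameter names are invented for this example:

```python
def may_port(record_owner: str,
             requester: str,
             third_party: str,
             audited_parties: set,
             opt_outs: set) -> bool:
    """Illustrative gatekeeping for a data portability request.

    A request succeeds only if all three hypothetical safeguards hold:
    1. the right to port vests with the data principal herself,
    2. the receiving third party has passed compliance audits,
    3. the principal has no standing opt-out against that third party.
    """
    return (requester == record_owner                       # safeguard 1
            and third_party in audited_parties              # safeguard 2
            and (record_owner, third_party) not in opt_outs)  # safeguard 3


# usage: "alice" ports her own data to an audited wealth-management app,
# but has opted out of sharing anything with an ad broker
audited = {"wealth-app", "ad-broker"}
opt_outs = {("alice", "ad-broker")}
```

Because each safeguard is a conjunct, a failure at any layer redacts the data from the requesting platform, which operationalizes the piece's point that the data principal stays in ultimate control.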

It is not constructive to vilify data portability as a leakage in the data protection systems of incumbent market dominators; it is perhaps more important to recognize that data portability can add value for data principals and should be fruitfully enabled in a privacy-preserving manner to empower consumers.

4. Conclusion

Compared to its previous iteration, the current draft of the Bill is less aspirational – doing away with controversial provisions on cross-border transfers of sensitive data, and doubling down on the fundamentals of consent and accountability. This abbreviation of ambition is perhaps most palpable in the fact that the present version of the Bill has 30 well-thought-out provisions (with some place-holders for future consideration), rather than the enormous 2019 draft with its daunting 98 provisions. Even so, it must be recognized that legislating on privacy and personal data protection has wide-ranging implications for an increasingly digital economy. While the provisions of the Bill are certainly interesting subjects of study in themselves, this piece underscores how the face of digital competition may drastically change upon its enactment. There is a lot to appreciate in the Bill, apart from the critiques advanced above. But perhaps it is more important to understand the salience of this long public consultation process, and the vast range of issues the Bill needs to account for. In our considered opinion, the Personal Data Protection Act is still a long way away.

[1] Press Information Bureau, “CCI imposes a monetary penalty of Rs. 936.44 crore on Google for anti-competitive practices in relation to its Play Store policies” (PIB, 25 October 2022) <>
[2] The Hindu Bureau, “NCLAT upholds penalty on Google; sets aside certain directions issued by CCI” (The Hindu, 29 March 2023) <>
[3] Lee Barrett, “Data security remains a challenge as interoperability moves closer to reality” (Chief Healthcare Executive, 19 June 2022) <>
[4] Joris Toonders, “Data is the New Oil of the Digital Economy” (Wired) <>
[5] Justin Fox, “How to Succeed in Business by Bundling – and Unbundling” (Harvard Business Review, 24 June 2014) <>
[6] Florian Ederer, “Does Big Tech Gobble Up Competitors?” (Yale Insights, 4 August 2021) <>
[7] Janae Sharp, “Facebook's Role in the Health Data Privacy Crisis” (Chief Healthcare Executive, 23 June 2019) <>
[8] “What is Big Tech’s surveillance-based business model?” (Amnesty International, 16 February 2022) <>
[9] Seeta Pena Gangadharam, “The Dangers of High-Tech Profiling, Using Big Data” (The New York Times, 7 August 2014) <>
[10] Haleluya Hadero, “Why TikTok’s security risks keep raising fears” (AP News, 17 March 2023) <>
[11] Rick Braddock, “How Big Tech uses data privacy concerns for market dominance” (VentureBeat, 18 April 2022) <>
[12] Organization for Economic Co-operation and Development, “Guidelines on the Protection of Privacy and Transborder Flows of Personal Data” (OECD) <> pp 14
[13] J. A. Obar, A. O. Hirsch, “The Clickwrap: A Political Economic Mechanism for Manufacturing Consent on Social Media” (2018) Social Media + Society 1-14
[14] “Data Empowerment and Protection Architecture: Draft for Discussion” (NITI Aayog, August 2020) pp 4 <>
[15] Ibid ch 3 pp 30
[16] See: Master Direction- Non-Banking Financial Company - Account Aggregator (Reserve Bank) Directions, 2016; M. Rajeshwar Rao, “Regulatory Framework for Account Aggregators” (RBI Bulletin, October 2021) <>
[17] Leslie D’Monte, “Account aggregators sectors set to soar” (Mint, 12 December 2022) <>
[18] Master Direction- Non-Banking Financial Company - Account Aggregator (Reserve Bank) Directions, 2016 [3]-[4] (the registration requirements for AAs are particularized depending on the financial sector within which they seek to operate, with sector-specific regulators enjoying exclusive jurisdiction over their operations), [8]-[9] (the data security measures and tech-specifications for AAs must comply with those prescribed by the RBI’s IT subsidiary), [12] (coverage under the Ombudsman Scheme would obligate even AAs to set up nodal officers like other NBFCs), [14] (the corporate governance and audit obligations apply to AAs like any other corporate covered by the Companies Act, 2013)
[19] Draft Personal Data Protection Bill, 2022, s 7(1) (hereinafter, “the Bill”)
[20] Ibid, s 7(6)
[21] Ibid, s 8(1)
[22] Ibid, ss 8(2)-8(6)
[23] Ibid, s 8(7); Ellen Sheng, “Employee privacy in the US is at stake as corporate surveillance technology monitors workers’ every move” (CNBC, 15 April 2019) <>
[24] Kristie Ball, “Electronic Monitoring and Surveillance in the Workplace” (Publications Office of the European Union, 22 September 2019) <>
[25] Guy Davidov, “The (Changing) Idea of Labour Law” (2007) 146 International Labour Review 311-320, 312
[26] K.S. Puttaswamy and Anr (I) vs. Union of India, (2017) 10 SCC 1; K.S. Puttaswamy and Anr (II) vs. Union of India, (2019) 1 SCC 1
[27] The Bill, ss 8(8)(d) & 8(8)(g)
[28] Ibid, s 8(9)
[29] Vallari Sanzgiri, “What Are The Consequences Of ‘Deemed Consent’ Provision In The Data Protection Bill?” (Medianama, 15 December 2022) <>
[30] “House panel finds Big Tech’s ad business a ‘monopolist threat’” (The Economic Times, 23 December 2022) <>
[31] OECD (n 12) pp 16
[32] Joshua Warner, “What’s the effect of GDPR and how is big tech responding?” (IG, 3 May 2018) <>
[33] Marco Luisi, “GDPR as a Global Standard? Brussels’ Instrument of Policy Diffusion” (e-International Relations, 9 April 2022) <>
[34] Monit Khanna, “WhatsApp Won’t Share European Users Data With Facebook Due To Strict EU Privacy Laws” (India Times, 14 January 2021) <>; c/f Karmanya Singh Sareen v Union of India, SLP (C) 804/2017
[35] Case C-311/18
[36] Mukesh Chandak, “Data Beyond Borders: The Schrems II Aftermath” (Thales, 2 March 2021) <>
[37] S. R. Potluri, V. Sridhar, Shrisha Rao, “Effects of Data Localization on Digital Trade: An Agent-Based Modeling Approach” (2020) 44(9) Telecommunications Policy
[38] The Bill, s 17
[39] Natasha Lomas, “Europe seals deal on USB Type-C common charger rules” (TechCrunch, 7 June 2022) <>
[40] Jason Harmon, “Opening the Door to OpenAPI & API Aggregator Benefits” (Spotlight, 29 September 2022) <>
[41] Carole Cadwalladr, “Revealed: 50 million Facebook profiles harvested for Cambridge Analytica in major data breach” (The Guardian, 17 March 2018) <>
[42] Bennett Cyphers, Cory Doctorow, “Privacy Without Monopoly: Data Protection and Interoperability” (EFF, 12 February 2021) <>
[43] Alfred Ng, “Lawmakers accuse tech giants of using privacy as a weapon to hurt competition” (CNET, 29 July 2020) <>
[44] Gopal Ratnam, “Critics say Big Tech uses trade deals to avoid data privacy laws” (Roll Call, 21 March 2023) <>
[45] “Data Transfer Project: Overview and Fundamentals” (White Paper) (DTP, 20 July 2018) <>
[46] OECD, “Data portability, interoperability and competition” (OECD) <>
[47] OECD, “Data Portability, Interoperability and Competition – Note by India” (OECD, 9 June 2021) DAF/COMP/WD(2021)31 <>
[48] Johannes Hammerling, “A comparative study on ‘the Right of Access’ under the GDPR and the CCPA” (2019) Thesis No. LAGF03/20192, Department of Law, Lund University
[49] Jason Kincaid, “Google To Facebook: You Can't Import Our User Data Without Reciprocity” (TechCrunch, 5 November 2010) <>

