
Gig Economy: A Tale of Algorithmic Control and Privacy Invasion

- Ankit Kapoor and Karthik Rai

Introduction


India employs 24% of the global online labour force – the most of any single country.[1] This has resulted in the popularisation of the ‘gig-economy’, i.e., the provision of contract-based, on-demand jobs by digital platforms to differently skilled labour.[2] In India, this is signalled by the growth of Zomato and Swiggy in the food-delivery industry, Urban Company in the domestic services industry, and Ola in the transportation industry.


Many gig-workers prefer the flexibility their work provides and may not wish to be categorised under a relationship of employment; consequently, however, they cannot avail themselves of certain basic safeguards.[3] Moreover, the technological architecture and socio-economic realities of platform work create heavy power imbalances between the worker and the platform/user. By engaging in the gig-economy, these workers essentially exchange their privacy for opportunity, from the hiring process to their daily lived experiences. In a world of ‘Surveillance Capitalism’,[4] this has wide-ranging implications.


Accordingly, we argue that the collection of data via surveillance of gig-workers, and the algorithmic processing of that data, cause multiple kinds of privacy harms, denuding workers of their autonomy and dignity. Given that the Indian legislative framework is woefully inadequate in dealing with these harms,[5] a combined approach premised on the values common to labour and privacy law is required to guarantee a right to privacy and algorithmic fairness and thereby safeguard gig-workers’ interests.


The paper proceeds as follows: we first contextualise the stage, nature, and extent of the privacy harms, as well as the harms from algorithmic control. We then examine the effectiveness of the Indian legislative framework – in both labour and privacy law – in tackling these issues. Finally, seeking inspiration from international legislative and policy attempts to safeguard against the abuses described, we propose a normative legislative framework for the gig economy in India.


I. Understanding the Privacy Harms and Algorithmic Control


In this section, we will explore three aspects: (1) the harm caused by the mere collection of data about workers by platform companies; (2) COVID-19-specific privacy concerns; and (3) the harms posed by algorithmic processing and the conclusions drawn therefrom. Importantly, (1) and (3) are closely connected, in that the algorithmic harms are premised on the data collected by the application. However, (3) goes beyond (1): not just the data processing, but specific aspects of how the algorithm is designed are also controversial. Hence, they are addressed separately.


The nature and extent of privacy harms will be illustrated using Koops’ typologies of privacy.[6] The typologies provide an appropriate and comprehensive framework to identify the different kinds of privacy implicated, and help in understanding the connections with labour law that the aspects highlighted above pose. The relevant typologies are (i) intellectual privacy, the privacy interest in one’s thoughts, mind, and development of opinions; (ii) decisional privacy, typified by intimate decisions; (iii) behavioural privacy, typified by the privacy interest a person has in determining their conduct in public spaces; and (iv) associational privacy, over an individual’s relationships and who the individual identifies with.[7] However, we must first understand the kinds of data collected, as it is on the basis of the data collected that the various kinds of privacy harms are implicated.


A. Type of Data Collected


The gig-workers are constantly tracked via the app. There is complete and granular knowledge of their whereabouts, time spent on the task, and rest taken. For instance, in Dunzo’s delivery partner agreement, alongside monitoring the gig-worker’s geo-location data when the services are being rendered, Dunzo may track said information for “safety, security, technical, marketing and commercial purposes”.[8]


While all the information on the worker is maintained on the platform, the worker is often not even aware of what information about them is being collected and how it is used.[9] There is thus an information asymmetry. This volume and granularity of data collection, together with the asymmetry, have two implications. First, platforms can influence workers’ behaviour. This is achieved by constantly tracking the partners, including detecting deviations from assigned locations or departures from the app.[10] Zomato riders have, upon logging off the app, received calls warning them to log back in or lose their incentives.[11] Thus, this surveillance not only achieves tangible commercial goals but also produces particular cultures which control behaviours and personal characteristics in subtler ways.[12] Workers have also received nudges during their ‘off-duty’, private time, urging them to be in the vicinity of a package – indicating extra-work data-veillance.[13] Moreover, tracking information about private life, such as time spent at home or the frequency of breaks, is used to gauge workers’ commitment levels.


Secondly, the workers are a constant source of monetisable data for the platform, without receiving any proportionate dues. Uber drivers, for instance, continuously relay data to the central platform, from which inferences about braking frequency, speed of acceleration, traffic patterns, and user demand can be drawn. This subsequently feeds into algorithm-based price determination.[14] The collected data is useful to the platform for business planning, enhancing operations, accelerating decisions, maximising profits, and responding to dynamically changing environments.[15] The gathered insights can also be sold to other companies for advertising or other commercial purposes, generating further revenue.[16]
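

To make this pipeline concrete, the sketch below shows one way driver telemetry could feed a demand-based fare multiplier. It is a minimal illustration under invented assumptions – the event fields, thresholds, and formula are hypothetical, and do not reproduce any platform’s actual pricing logic.

```python
# Purely illustrative: how relayed driver telemetry might feed dynamic pricing.
# Every field, threshold, and formula here is a hypothetical assumption.
from dataclasses import dataclass


@dataclass
class TelemetryEvent:
    driver_id: str
    speed_kmh: float   # relayed continuously from the driver's device
    hard_brake: bool   # e.g. inferred from accelerometer readings
    zone: str          # geographic cell the driver is currently in


def idle_drivers_in_zone(events: list[TelemetryEvent], zone: str) -> int:
    """Counts distinct drivers pinging from the zone while near-stationary."""
    return len({e.driver_id for e in events if e.zone == zone and e.speed_kmh < 5})


def surge_multiplier(open_requests: int, idle_drivers: int) -> float:
    """Toy demand/supply ratio: more requests than idle drivers pushes fares up."""
    if idle_drivers == 0:
        return 2.0                        # hypothetical cap when supply is zero
    ratio = open_requests / idle_drivers
    return min(2.0, max(1.0, ratio))      # clamp between 1.0x and 2.0x


events = [
    TelemetryEvent("d1", 3.0, False, "zone-7"),
    TelemetryEvent("d2", 42.0, True, "zone-7"),   # moving, so not counted as idle
]
print(surge_multiplier(5, idle_drivers_in_zone(events, "zone-7")))  # -> 2.0
```

The point of the sketch is that routine telemetry, relayed as a by-product of working, becomes a direct commercial input: the worker’s movements set the price, while the worker sees neither the inputs nor the formula.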


In addition to their existing socio-economic vulnerabilities, this imposes new forms of control over workers through the extraction and commodification of their data.[17] The extent of information collected has been widened further through the use of wearables and fitness-tracking apps that collect health and psychological data.[18] This access to health data opens up new avenues for discrimination, especially gender-based discrimination. Salaries and job security are also becoming contingent on job scores. Associational privacy is consequently implicated as well – platforms display the worker’s full name alongside a picture, which has enabled users to extrapolate the caste and/or religious identity of the worker and discriminate against them on that basis.[19]


This surveillance and monetisation violate intellectual and behavioural privacy. The primary issue with surveillance is that it impairs an individual’s ability to keep information about themselves private by allowing ‘unwanted access to the physical self’.[20] This causes severe stress and demoralisation, alongside a chilling effect on workers’ intellectual faculties, as they become extremely cautious in their public behaviour.[21] Even the liberty of workers is constrained by forcing them to stay in particular places.[22] The worker can no longer manage the boundaries of their privacy.[23] This is especially so since delivery partnership agreements often allow termination if statements detrimental to the platform are made.[24]


B. COVID-19 Specific Concerns


Platforms extensively collected the health details of workers, both voluntarily and involuntarily. Two practices stand out: (1) workers were required to upload a daily temperature reading and a selfie to prove that they were wearing a mask;[25] and (2) workers were mandated to share vaccination details with the platform, which made them public to users.[26]


This health data collected by the platforms, alongside the Aarogya Setu application, can be used to allocate a quantum of work to individuals based on the platforms’ best interests. Additionally, the usage of the Aarogya Setu application was effectively made mandatory by the platforms for their workers by linking such usage to either their pay/incentives or withdrawing access to their IDs on the app.[27]


These practices implicate the behavioural privacy of workers, as they are forced to artificially mask their symptoms, even though those symptoms might have no correlation with COVID-19.[28] This imposition is problematic for several reasons. First, platforms now have access to an even greater pool of health data, allowing for easier termination if workers self-report any symptoms on the app.[29] Secondly, the integration of Aarogya Setu would make platforms continuously privy to worker information, even after the pandemic, as against surveillance only when workers use the app.[30] Thirdly, the Bluetooth data from Aarogya Setu can be used to identify the movement of individuals or their collectivisation, thus violating their associational privacy.[31]


C. Algorithmic Control


Algorithmic control can be understood through four characteristics:[32] there is constant tracking of workers’ behaviour through data; there is constant performance evaluation of workers through the gathered data; decisions are implemented automatically, with little or no human intervention; and workers interact with a system rather than with humans. Given these characteristics, several issues emerge:


i. Opacity and Bias

There is opacity over the data collected, the source code, and its algorithmic deployment. This can be due to intentional secrecy, lack of technical literacy, or machine-learning opacity. Platforms provide a vague list of factors, like loyalty towards the platform or feedback from the customers, that dynamically influence the algorithm.[33] However, for proprietary reasons, they refuse to divulge specific information.[34]


The consequence is that, while the app collects extensive data about them, workers are not aware of how the algorithms are being used to direct, evaluate, and discipline them.[35] This also entails ‘disintermediation’: in the absence of human managers, workers receive no explanation when their work is rejected, when they receive a bad rating, or when they are terminated.[36] The experience is incredibly frustrating and confusing for workers. The issue is particularly compounded by the lack of grievance or dispute redressal mechanisms.[37]


Additionally, algorithmic management is not value-neutral. Algorithms are not merely encoded with technical information on rules and routines; they also create and represent the interests of powerful actors (i.e., platforms). The biases integral to structural arrangements of work are not eliminated, merely transferred to the platform.[38] Thus, the intersectionality of class, caste, age, and gender influences the worker’s experience – in terms of the conclusions the algorithm draws about them through surveillance, and the ratings that users provide.[39] The use of algorithms can worsen pre-existing biases or generate new ones, since the codes are ultimately written on the instructions of human programmers, who are not unbiased themselves. Moreover, the data fed into the algorithm may itself be biased, especially if taken from incorrect sources.[40] These features lead to gig-workers’ bodily privacy being violated, as this disintermediation creates uncertainties about how gig-workers should conduct themselves at work.[41]


ii. Ratings

Platform companies’ apps operate on a ‘user ratings for workers’ system, wherein the consequence of bad ratings can be deactivation from the platform, re-training, or mere penalisation. What counts as ‘bad’, and the ensuing consequences, differ across platforms. Importantly, the ratings system is effectively one-sided, as workers’ ratings of customers have little consequence, leading to structural domination of the platform over the workers. Moreover, the user might expect greater service at the same cost, or even service off the platform, while the platform expects strict compliance with its rules. Workers thus need to amplify their efforts to satisfy both sets of expectations, which are often at odds.[42] It is also difficult for workers to understand or challenge their assessment, since external ratings arrive at erratic intervals after service delivery.[43]
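

A minimal sketch of such a rating-driven sanction rule is set out below. The thresholds and outcomes are invented purely for illustration; no platform’s actual policy is reproduced here.

```python
# Hypothetical sketch of the one-sided, rating-driven sanction rule described
# above. All thresholds and consequences are assumptions for illustration only.
def sanction(avg_rating: float, rated_jobs: int) -> str:
    """Maps a worker's average user rating to a platform-imposed consequence."""
    if rated_jobs < 20:
        return "no action"     # assumed minimum sample before evaluation
    if avg_rating < 4.0:
        return "deactivation"  # removal from the platform
    if avg_rating < 4.4:
        return "re-training"   # mandatory training module
    if avg_rating < 4.6:
        return "penalisation"  # e.g. fewer or lower-value job offers
    return "no action"


# Note the asymmetry the text describes: there is no comparable function
# imposing consequences on customers for the ratings workers give them.
print(sanction(avg_rating=4.3, rated_jobs=150))   # -> "re-training"
```

Even in this toy form, the one-sidedness is structural: the worker’s livelihood hinges on a few tenths of a rating point, while the customer’s conduct is never scored against any threshold at all.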


The fear of consequences coerces workers to accept undesirable work, even at undesirable times. It may coerce them to forego workplace safety or to refrain from reporting harassment by clients. A masseuse on Urban Company reported being blackmailed by a customer to provide extra free service under the threat of a bad rating.[44] The issue is compounded for deactivated workers, since they do not receive any notification or warning.[45] The ratings also cause uncertainty, dissatisfaction, and anguish among workers. They signal to the worker that their behaviour is being observed through numerical metrics.[46] This implicates behavioural privacy, as the worker needs to alter their public behaviour to conform to the platform’s and users’ expectations.


There is also scope for gender and racial stereotyping as external identity informs the rating on the platform,[47] especially since users have no accountability over the ratings they provide.[48] It can also lead to the creation of a ‘scored society’, where data about every aspect of the worker is collected and then used to assess their worthiness for different tasks.[49]


iii. Gamification

Platforms have relied on gamification, a form of algorithmic rewarding, to incentivise workers by converting competitive work into a game-like activity. This is done by displaying a weekly scoreboard of top performers to increase intra-worker competitiveness,[50] sending workers ‘automated nudges’, or adjusting performance benchmarks based on real-time progress.[51] While these incentivise workers, they create punitive and non-transparent work environments that disproportionately pressure workers to meet onerous benchmarks. Gamification compromises workers’ ability to consciously determine moral and practical limits for their work. In this way, platforms manufacture workers’ consent for additional work, violating the decisional autonomy that permeates privacy.[52]
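

The sketch below illustrates how an ‘automated nudge’ tied to a benchmark that adapts to real-time progress might work. The 1.2 multiplier, the message text, and the target logic are all hypothetical assumptions, not any platform’s actual code.

```python
# Illustrative sketch of an adaptive-benchmark nudge. The multiplier and
# message are invented; the point is the moving goalpost, not real logic.
from typing import Optional


def nudge(deliveries_done: int, hours_worked: float, weekly_target: int) -> Optional[str]:
    """Returns a motivational prompt whenever the worker is 'behind' target."""
    # The benchmark creeps upward as the worker performs better, so the
    # goalposts keep moving (the manufactured-consent dynamic described above):
    adjusted_target = max(weekly_target, int(deliveries_done * 1.2))
    if deliveries_done < adjusted_target:
        pace = deliveries_done / hours_worked if hours_worked else 0.0
        return (f"Only {adjusted_target - deliveries_done} more deliveries to "
                f"this week's bonus! Current pace: {pace:.1f}/hour.")
    return None


print(nudge(deliveries_done=48, hours_worked=30.0, weekly_target=50))
# -> "Only 9 more deliveries ..." (the target silently rose from 50 to 57)
```

Because the target scales with output, the worker can never comfortably ‘finish’; each burst of effort simply raises the benchmark, which is precisely the consent-manufacturing dynamic the text describes.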


iv. Dehumanisation

Algorithms control several aspects of the workers’ lives, from allocating work to evaluating it. The inability to interact with people and build meaningful relationships during this process causes alienation among workers.[53] They are further alienated when they lose control over their work and are refused autonomy over their actions. For instance, for an Uber/Ola driver, the navigational application substitutes for their localised understanding of routes.[54] This can wither workers’ mastery of their skills, forcing them to resemble mechanical robots.


Even while receiving incentives, workers are severely bound by conditions imposed by the platform. In this way, platforms maintain control over the manner and timing of work while sustaining the idea that workers enjoy total freedom over their schedules.[55] The constant surveillance reduces the opportunity for workers to engage in personal development and growth, because the surveilled scenario prevents them from thinking out of the box; instead, they must act in terms of rule-based compliance.[56] This impacts the intellectual privacy of workers, as they are unable to exercise their mental and physical faculties to develop and express themselves.


The difficulty in creating empathy between the worker, platform, and user has resulted in this situation also being termed ‘inadvertent algorithmic cruelty’.[57] Ultimately, algorithms are merely codes that do not accommodate human emergencies. The data collected is also standardised, reducing the opportunity to collect contextual data about workers. This has severe implications for workers’ rights.


For instance, Uber deducts drivers’ earnings if they have erred. If a passenger complains that the driver took an ‘inefficient route’, this needs to be verified using contextual knowledge – the presence of physical obstacles invisible to navigation systems, or whether the deviation was requested by another passenger.[58] However, this independent verification does not usually happen, and the worker is punished for doing the right thing. The surplus of available labour has resulted in platforms being designed to prioritise workers’ exit over initiating dialogue with them. At the same time, platforms leverage the high initial investment made by workers to ensure they remain on the platform, and in the process dehumanise their contributions.[59]


D. Why this is a Labour Issue: The Common Goals of Labour and Privacy Law


The discussion till now has proceeded through a privacy-law lens. What, then, would the connection with labour law be?


In the discussion on privacy harms above, we observed how worker behaviour is influenced through data-veillance; how workers are discriminated against based on the data collected; and how algorithms can arbitrarily deny workers a fair right to earn wages and to defend themselves against unreasonable terminations. Such disproportionate and non-transparent surveillance and algorithmic processing violate intellectual and behavioural freedom and deny workers a fair right to a livelihood. There is a clear overlap between this concern and how labour law tackles arbitrary terminations, protects associational freedoms such as unions, and provides fair wages based on transparent standards.


Guy Davidov highlights that labour law was intended to achieve multiple goals. First, it addresses the systematic vulnerability arising from the subordination and power imbalances that permeate employment relationships, and from employers’ prerogatives to impose unilateral decisions on workers, reducing worker autonomy.[60] Secondly, labour law has distributional implications: it responds to multiple market failures, such as information asymmetries. Labour law can, inter alia, promote trust in employer-employee relationships, achieving ‘workplace democracy’ and thereby upholding workers’ dignity.[61]


Privacy and data-protection law serve to achieve the same values. When information/data is shared with data-fiduciaries, we are vulnerable and trust them to protect it by being honest and transparent in processing data. This trust is ensured by guaranteeing privacy legislatively, for instance, which consequently contributes to reducing vulnerability.[62] Similarly, privacy law facilitates autonomy as it ensures that people go about their business freely without the fear of being continuously policed/monitored. By furthering these values, it leads to self-development and upholds individual dignity,[63] by allowing individuals to negotiate their personal and professional relationships.[64] Privacy and labour law can thus work in tandem, where either law, depending on the context, can be useful in guaranteeing the values that each legal regime strives to uphold.


That both the right to privacy and the right to employment are ‘founded on human dignity’ was recognised by our Supreme Court.[65] The ILO’s Code, too, implicitly seems to recognise the fundamental overlap between labour and privacy law’s goals.[66] Thus, while the rights of the platform company to direct work should be upheld, we need to limit the discretion they possess and redistribute power in such relationships. Concretising privacy rights for gig-workers can thus achieve the goals that labour law wishes to secure.[67] It is with this objective that the discussion below proceeds.


II. Examining legislation: Privacy and Labour Law


So far, we have viewed the extent of data collection perpetuated by platform companies, the extent of its deployment and processing by algorithms, and the consequent harms to specific facets of privacy and autonomy. It now becomes pertinent to examine whether there are adequate safeguards in India’s labour and privacy laws, both extant and upcoming, to protect against such excesses.


A. Labour Laws


Currently, no law governing labour relations contains safeguards protecting any worker from privacy violations, let alone gig-workers.[68] Though the right to privacy has been concretised as a fundamental right, this claim would not be available against private-sector employers like platform companies. However, there is a potential to institute proceedings based on common-law principles such as defamation and breach of confidence, which can serve to protect against privacy violations to some extent.[69]

An examination of the social security safeguards available to gig-workers is also important, since these have an impact on their privacy rights, as shall be demonstrated below. Existing social security legislation requires a relationship of employment to be proved, apart from other requirements such as there being an ‘establishment’.[70] As for the Unorganised Workers’ Social Security Act (‘UWSSA’), gig-workers may not satisfy the definitions of ‘unorganised workers’ or ‘self-employed workers’, as the numerical thresholds provided therein can be easily manipulated.[71]

The upcoming Social Security Code, which combines the ESI and EPF laws, recognises gig and platform workers as persons taking part-time jobs outside ‘traditional’ employer-employee relationships. It mandates the registration of gig-workers[72] and empowers the Central Government to formulate welfare schemes pertaining to accident insurance, old-age protection, etc.[73] However, it does not go far enough to provide them with specific entitlements to social security.[74]


B. Privacy and Algorithmic Fairness Laws


B.1. The IT Act and SPDI Rules

The Personal Data Protection Bill, 2019 (‘PDPB’) underwent a review by the Joint Parliamentary Committee (‘JPC’), which prepared a Report (‘JPC Report’). The JPC Report recommended the creation of the Data Protection Bill, 2021 (‘DPB’). However, the Bill has not been finalised into law as of now.[75] Consequently, the major sources of data protection and privacy are the Information Technology Act and the 2011 Rules on Sensitive Personal Data or Information (‘SPDI’) thereunder. The IT Act protects against personal information being intentionally disclosed in breach of contractual agreements.[76] Besides, ‘body corporates’ such as platform companies that do not implement ‘reasonable security practices’ in processing SPDI – which they require for maintaining employee records and other purposes – can be penalised.[77]


The 2011 Rules were framed to outline these ‘reasonable security practices’ and define SPDI narrowly, as comprising sexual orientation, passwords, and biometrics, inter alia.[78] They require a clear privacy policy outlining the data collected, and consent must be obtained for processing such data. Only so much SPDI as is ‘necessary’ and ‘connected’ to the employer’s activity may be collected.[79] There is some scope for the employee to opt out of data collection, and to review the SPDI the employer possesses.[80]


However, these Rules may not be very effective in the platform context. Certainly, they have protected against executive abuses in the past – for instance, when the Karnataka Government passed the Aggregator Rules, which permitted the sensitive information collected to be inspected by the state ‘at any time’, the Karnataka High Court struck them down for violating the fundamental right to privacy and the SPDI Rules.[81] However, much of the data collected by such corporations is in the form of metadata (data about data, such as GPS information) and non-sensitive information, neither of which is regulated by the SPDI Rules.


Perhaps gig-workers could take recourse under common law, as was done in Neera Mathur, where a requirement to disclose details about pregnancy in the context of employment was deemed “embarrassing” and struck down.[82] However, this decision does not provide a concretised entitlement to privacy for employees or gig-workers. Thus, the SPDI Rules do not provide adequate safeguards against most of the data processing that platform companies carry out on gig-workers, and workers have no entitlement to the fair processing of such data.


B.2. The DPB and NPD Framework

The forthcoming Data Protection Bill (‘DPB’) provides broader protection than the SPDI Rules. There have been reports that a new bill may completely replace the existing DPB.[83] However, this is merely under discussion; until then, the existing (proposed) framework is the DPB.


The DPB governs the processing of “personal data”, which includes any data directly/indirectly attributable to a person, also including inferences drawn from profiling gig-workers.[84] Data collection/processing must comply with principles of providing prior notice of collection, limiting data collection to the purpose at hand, storing data for only as much time as is legitimately required, processing only accurate information, and securing the ‘free’ and ‘informed’ consent of the data principal (who here is the gig-worker) before data-processing.[85] However, there is an exception for processing data without consent, if the data collected is required to assess employees’ work performance and to recruit/terminate employees, inter alia.[86] Interestingly, whether this exception extends to gig-workers (who have not conclusively been categorised as “employees” yet) or not is unclear, though the assumption has been that it is extendable to them.[87]


The JPC, while deliberating on the PDPB (now the DPB), identified that employees’ data has to be handled sensitively, given the relationship between employees and employers. It suggested that personal data may be processed without consent if the data principals (such as employees) ‘reasonably expect’ the employer to process the data, or if such processing is ‘necessary’.[88]


Despite this concern indicated by the JPC, what the employee ‘reasonably expect(s)’ is extremely subjective, and the phrase can be interpreted by employers such as platform companies to authorise the collection of troves of personal data without consent under the garb of ‘reasonable expectation’. Considering the heavy power imbalances in such relationships, the phrase could serve the employer’s interests more than the gig-worker’s. Consequently, this safeguard is not sufficient.


Besides, data can be processed without consent for ‘reasonable purposes’ like fraud prevention.[89] Thus, the DPB gives broad scope to platform companies to process personal information about gig-workers under the guise of assessing their performance, to draw conclusions about them which cannot be challenged. This poses significant threats to the dignity and a reasonable expectation of privacy of the workers.[90] There is a vague obligation on data fiduciaries like platform companies to process data in a ‘fair manner’;[91] however, it can be conveniently interpreted to favour platform companies’ opaque algorithmic processing.


There may be some positive takeaways, however. For instance, the gig-worker can seek access to the data profile about them processed by companies, and this has to be provided in a transparent manner.[92] Inaccurate or misleading personal data can be corrected, and unnecessary data can be removed. If such a request is denied, gig-workers can indicate, alongside the contentious information, that its accuracy is disputed.[93] Moreover, any rejection of requests by the platform companies can be challenged before the Data Protection Authority under the Act.[94] But how these rights will be operationalised remains to be seen, given the digital illiteracy of gig-workers, the non-transparency of platform companies’ algorithms, and the difficulty of displaying such additional information (such as the fact that the information is disputed) on app interfaces.


As for Non-Personal Data (‘NPD’), the JPC Report stated that NPD, which refers to ‘data other than personal data’,[95] should be brought within the DPB’s ambit. There would be a separate regulatory framework for NPD, but within the DPB itself, not outside it.[96] The scope of this framework is not clear at the moment.


Before this, however, two reports in 2020 sought to establish a framework to regulate NPD. These may indicate the stance the government would take on the framework it envisions for NPD under the DPB. NPD – information not comprising personally identifiable information, such as anonymised information – includes, inter alia, the metadata that platform companies collect from gig-workers. The NPD Reports aim at maximising NPD processing and commercialisation, but they largely override the privacy concerns arising from the possibility of de-anonymising data to link it to specific workers.[97] This plays into the surveillance-capitalism narrative, by which gig-workers will be disproportionately affected – for instance, if datasets containing unfair conclusions drawn about them are sold to third parties.[98] Besides, the revised Report on NPD categorises inferred and derived data as private NPD, over which private corporations (such as platform companies) would own the intellectual property.[99] This means there is a dearth of entitlements allowing workers collectively to control the narrative that such inferred data may present about them.


B.3. Safeguards for Algorithmic Fairness

Neither the SPDI Rules nor the DPB provides for increasing algorithmic fairness, or for safeguards against automated processing.[100] The DPB merely provides for access to, and portability of, data generated via automated means, unlike the EU’s GDPR, which provides greater safeguards, as detailed below.


However, the Motor Vehicle Aggregators Guidelines, 2020 (‘2020 MV Guidelines’), formulated under the Motor Vehicles (Amendment) Act, 2019, impose obligations on ride-hailing platforms like Uber regarding data storage, inter alia. They are intended to facilitate the ‘regulation’ of ‘aggregators’ and hold them accountable for their operations.[101] They also require the ‘functioning’ of ride-hailing companies’ algorithms to be transparent, but do not guarantee a proper right to an explanation of automated decisions.[102] Moreover, what exactly amounts to transparency is unclear, and there is no mandatory requirement of explainability for the algorithms that facilitate decisions on prices and the rating of drivers, among others. Hence, the effectiveness of the 2020 MV Guidelines seems unclear at the moment.


Under labour law, the Industrial Disputes Act and the Industrial Relations Code proscribe, inter alia, employers from perpetrating ‘unfair labour practice(s)’.[103] Even assuming that an employer-employee relationship is established in the gig-work scenario, the Fifth Schedule, which outlines the exhaustive list of unfair labour practices,[104] would still be insufficient to tackle the algorithmic excesses perpetuated by platform companies. For instance, there is no entry in the Fifth Schedule corresponding to the complete opacity of algorithmic decision-making during employment, which inflicts insecurity and fear on gig-workers.


As for unfair dismissals by algorithms, dismissal “in utter disregard” of the principles of natural justice (‘PNJs’) is an unfair labour practice.[105] However, the Bombay High Court has opined that ‘utter’ must be interpreted so as to avoid redundancy: mere technical disregard of PNJs might not amount to an ‘unfair labour practice’ – only blatant disregard would.[106] There is thus a possibility for platform companies to circumvent the Schedule by showing that their unfair dismissals were only technically non-compliant with PNJs. Current labour and privacy laws therefore fail to safeguard algorithmic fairness as well.


C. International Jurisprudence


Five sources are examined here – the International Labour Organisation, the EU GDPR, the EC Directive on Improving Working Conditions in Platform Work, 2021 (‘the 2021 Directive’), collective bargaining agreements, and case law directly on this issue.


C.1. ILO

The ILO has not produced any conventions or significant literature on gig-workers’ privacy rights. However, it formulated a non-binding Code of Practice for the Protection of Workers’ Data in 1997, which can afford some guidance on labour law’s conceptions of privacy. Grounded in the need to preserve workers’ dignity,[107] it states that personal data-collection should be in clear terms (cl.6.3). The storage and coding of worker data should not ascribe discriminatory attributes to workers (cl.8.6), which could be invoked against opaque algorithmic processing. The proportionality principle requires that the least intrusive data-processing methods be employed, with workers informed of how long and for what purpose the processing shall continue (cl.6.8) – thus, continuous monitoring leading to ‘psychological distress’ should be used only exceptionally, such as for health-related reasons.[108] Other rights, pertaining to access and the correction of incorrect data, are also provided (cl.11).


There are also safeguards that can directly protect against opaque algorithmic processing. The Code states that personal data gained via surveillance must not be the determinative factor in assessing workers’ performance (cl.5.6). In the case of inferences derived from personal data, workers can add their own views to such conclusions to provide a complete picture (cl.11.12). A forum for challenging employers’ adherence to the Code should also be available. This rejection of ‘mechanical’ decision-taking is premised on algorithmic due-process considerations.[109]


C.2. GDPR

C.2.1. Posited Law

The GDPR includes more safeguards than India’s DPB. It allows states to have specific rules protecting employee privacy for purposes ranging from recruitment to termination to ensuring health at work, in order to guarantee worker dignity.[110] Apart from rights to data portability, access, and correction similar to those in the DPB, the GDPR – unlike the DPB – has a consent requirement for processing employee data. Legislative debates during the GDPR’s formulation recognised that even this consent might be a façade owing to differences in bargaining power. In fact, Recital 43 of the GDPR adopts this view, stating that in cases of such imbalances, consent should not be the sole ground for data processing.


The GDPR thus emphasises employers’ ‘legitimate interests’ as a ground for data processing in such cases. Despite this being a subjective ground, it would include reasons like protecting work efficiency and intellectual assets. However, the ‘legitimate interests’ should not override employees’ fundamental rights and freedoms.[111] Thus, it emphasises the proportionality of data processing, and employers cannot process data just because it serves their economic interests.


Additionally, the GDPR provides an entitlement not to be subject to decisions based solely on automated processing, something the DPB does not provide.[112] The exception is where such processing is needed for the conclusion or performance of agreements, but even there, safeguards like human intervention and scope to challenge the decision are provided.[113] Thus, the GDPR parallels the ILO’s Code, and it would be more difficult under it than under the DPB for platform companies to conduct disproportionate data collection, impose unilateral algorithmic dismissals and controls over gig-work, or monetise employees’ data sets. An improvement in India’s legislative framework should account for these differences.


C.2.2. Case Law

A series of decisions by the Amsterdam District Court on gig-workers’ privacy, access to data, and algorithmic processing, all passed in 2021, provide guidance on delineating gig-workers’ rights. In the first-ever decision on this issue,[114] the Court ruled that Uber’s dismissal of workers on grounds of ‘fraud’ solely through automated means violated the GDPR’s proscription of such decisions and was illegal. Consequently, it reversed the dismissals and ordered payment of a fine and the reinstatement of the drivers’ Uber accounts.[115]


Subsequent decisions by the same court involved drivers seeking access to personal data from Ola and Uber, and claims regarding unlawful algorithmic dismissal. In Ola, the Court ordered Ola to allow drivers to inspect their ‘ratings’ history, which determined the quality of the fares they received and their potential dismissal. It noted that this was personal information under the GDPR, and thus access had to be provided, albeit in an anonymised format to protect passengers’ privacy.[116] Ola was also found to have used solely automated means to dismiss workers. It was therefore ordered to provide access to information about the underlying logic behind the terminations[117] – such as how it processed personal data to arrive at a fraud probability score[118] and an ‘earning profile’ used to calculate bonuses.[119] As for the ‘Guardian’ system Ola deployed to conduct worker surveillance and ‘detect irregularities’, the court ordered that the personal data used in this system be made accessible to drivers.[120]


In Uber,[121] the court held that drivers seeking to collectively access their data were not abusing their rights, and ruled that they could establish a gig-workers’ data-trust.[122] However, the Court rejected most of the drivers’ requests to access various categories of data, such as driving ‘profiles’ and the data (like location, rejected requests, etc.) used to penalise drivers,[123] observing that the applicants had not ‘sufficiently’ indicated what kinds of personal data they needed access to.[124] It also denied their requests to access individual passengers’ ratings of their rides and the individualised feedback they provided, in order to protect passengers’ privacy.[125]


Uber was also found not to use solely automated decisions when terminating drivers,[126] and the court placed the burden of proving the contrary on the drivers.[127] Simultaneously, Uber was ordered to provide two drivers dismissed for ‘fraud’ with the data on which it had based its decisions,[128] since it had not clarified which fraudulent activities had caused their dismissal.


Viewing the reasoning in these cases helps us understand the kinds of entitlements India must provide to protect gig-workers. Transparency was heavily emphasised in Ola, and such transparency in ratings and automated dismissals would help workers ascertain whether the algorithm discriminated against them. Besides, the rights of gig-workers to unionise and establish data-trusts were also recognised. However, the judgments failed to adopt an entirely pro-worker stance – they placed an undue burden on workers to show they were subject to automated decision-making (in Uber) and to specify the kinds of data they wanted access to. The informational asymmetries in such labour relationships, and the opacity of platform companies’ labour practices, prevent workers from discharging this burden. These considerations should be factored into the normative framework proposed below.


C.3. The 2021 Directive

To provide for better conditions in the platform economy, the Directive was proposed in December 2021. Among other things, the Directive provides clarity on when a worker in the platform economy would be an ‘employee’ of the company, entitled to the benefits of the ‘working conditions’ that employees receive.[129] It presumes a relationship of employment where certain elements of the work are algorithmically controlled.[130] The chapter on algorithmic management requires the platform to inform workers about the ‘automated monitoring systems’ that monitor and evaluate them, including information such as the parameters considered while taking decisions.[131] Platforms also have to inform workers about their ‘automated decision-making systems’. All this information should be provided to the representatives of platform workers and to labour authorities as well.[132] Additionally, information that is not strictly required for the performance of the contract is not to be collected, such as information on the platform worker’s psychological wellness or post-work activities.[133]


The Directive also requires humans to monitor decisions based on algorithmic processing, ensuring that such automated decision-making does not impact the mental or physical health of the workers, or put ‘undue pressure’ on them.[134] Besides, decisions based on algorithmic processing can be questioned and explanations can be sought from a human intermediary of the platform company about the facts and reasons behind the decision.[135] If platform workers think that their rights have been violated, they can request the platform to review the decision, which has to be done forthwith, or compensation should be provided in case of infringement of rights.[136] Additionally, workers/their representatives should be briefed about the introduction of changes in or additions to, the automated decision-taking system.[137]


Many of the standards mentioned in the Directive are extremely subjective – for instance, a bar on using algorithms that put ‘undue pressure’ on workers or affect their ‘mental health’ could cover every algorithmic tool the platforms use; similarly, how to decide whether an algorithmic decision ‘infringes the platform worker’s rights’, thus necessitating compensation or rectification, is also uncertain. Nonetheless, the 2021 Directive is a great step forward in recognising the rights of platform workers, especially in the fields of algorithmic management and data collection, and it includes important inputs that should inform the normative framework.


C.4. Collective Bargaining Agreements

Apart from statutes, labour unions have previously entered into collective bargaining agreements to protect employee privacy against digital technologies. Belgium’s National Collective Agreement (2002) brought about greater restrictions on the surveillance of online communication. Collective bargaining agreements in Sweden (2014), Argentina (2015) and France (2016) recognised that the stress and fear caused by adopting technological surveillance in the workplace should be diminished, and included safeguards against the same, such as France’s ‘right to disconnect’ to avoid burnout.[138] A collective bargaining agreement between Norway’s Confederation of Trade Unions and its Confederation of Businesses agreed on proportionality in workplace surveillance, and that any new measures would be introduced only after consultation with the union.[139]


The first-ever collective bargaining agreement for gig-workers was entered into in 2018 by a Danish Trade Union with a platform company providing cleaning services, which also had clauses pertaining to data protection. The ‘Protocol’ stated that publication of personal data on its platform should be based on explicit consent and that ‘false’ comments can be removed from each worker’s profile when requested.[140] These agreements provide significant inspiration to fill in the gaps in India’s law.


III. Proposing a Normative Legislative Framework


A. Envisioning a platform sector-specific privacy law


Though the DPB will soon be legislated, even if its identified loopholes were filled, it would be inadequate to deal with the specific perils of the gig economy,[141] for several reasons.

One, due to the power differentials in the relationship, the consent framework, which forms the bedrock of the DPB, breaks down for contractual and technological reasons.[142] Contractually, clauses stating the purposes of data collection are phrased broadly and vaguely enough to extend to future uses that users never foresaw.[143] The volume and frequency of data collected through technical standard-form contracts, along with the sheer number of such contracts, result in ‘consent fatigue’ among data subjects. This is compounded by the low literacy of many gig-workers; it is unfeasible to expect extensive knowledge of contractual terms from them.[144] Technologically, consent is undermined by the interoperability of modern databases, which allows distinct datasets to interact and produce unique and powerful insights.[145]


Two, as the ILO identified, the scope for data processing in such relationships is higher than in any other sector,[146] spanning demographic, health, and employment indicators. As indicated earlier, this data is used in disparate situations, some of which are detrimental to gig workers. This problem is compounded given the state of machine learning algorithms, which are excellent at spotting patterns and profiling data.[147] Therefore, there is a need to contextually modify the understandings of entitlements.[148]


Three, a general privacy law like the DPB frames data-protection entitlements at the ‘lowest denominator’ level, from which gig-workers may not benefit. For instance, in Uber, the court denied portability of data generated through ‘data analysis’ and of the personal data used to make price-fixing decisions, noting the GDPR’s inapplicability to these requests. It also imposed on the workers the burden of requesting specific data, without recognising the disproportionate impact this burden would have on them.


A privacy law that seeks to protect the interests of labour should guarantee the goals common to both labour and privacy law – that is, trust and autonomy have to be secured for gig-workers. However, existing privacy and algorithmic-processing laws are not fully sufficient, since they are oriented towards a lowest-denominator conception of privacy. Given the precarity and informational asymmetries gig-workers face, the values that privacy law should ideally pursue may be subverted by employers. Thus, to ensure that these laws better serve their goals, and thereby provide the stronger protection that a labour-oriented law would require, a law preserving gig-workers’ privacy and preventing unfair determinations at their expense must provide greater safeguards. Alternatively, the existing law should be interpreted with nuance and specification, as described below, when dealing with the gig-sector.


A.1. A right to privacy

Thus, a ‘right to privacy for gig-workers’, covering both personal data and metadata, must be explicitly guaranteed. Principles of data minimisation, transparency, and proportionality have to be narrowly tailored for this sector.[149] In the context of food-based platforms like Zomato, for instance, only such data as is required for background checks and business purposes should be collected. Clauses permitting collection for “marketing, research, and any other purpose as Zomato may deem fit”, as Zomato’s agreement with Delivery Partners currently reads,[150] must be read down. Purposes like the monetisation of data sets must therefore be proscribed, or alternatively determined by workers.[151] Besides, some data-collection practices, like Uber’s infamous ‘Hell’ program – under which it surveilled gig-workers who also worked with its rival Lyft and then pressured or coaxed them to abandon Lyft[152] – should be banned altogether. Processing data to determine worker incentives based on worker rankings could be prohibited, and flat payment rates devised.[153] Processing of sensitive information, like health-related information, should be banned unless it satisfies an extremely high threshold of necessity – that is, unless the processing is absolutely essential for the effective rendering of services by the platform.


For the workers, there should be a right to access all personal data and metadata relating to oneself,[154] including conclusions and inferences drawn from such data. Departing from the Uber decision, the burden of specifying the categories of data to be accessed should not rest on the worker. Rather, the burden of proving a ‘significant’ countervailing interest (for instance, an intellectual-property interest) in denying data access should rest on the employer.


Genuine business reasons might require the processing of gig-workers’ data, such as fraud prevention and increasing productivity.[155] However, privacy expectations are contextually different in the gig-economy, where power relations are acutely skewed in companies’ favour and continuous, invasive data collection hinders workers’ trust, autonomy, and dignity. Thus, proportionality principles in the gig-work context should be applied more rigorously.[156] Less significant categories of data – such as the speed of the vehicle or rates of acceptance of customers’ requests – should not be collected, or should be discarded if collected. There should also be a proscription on the use of wearables and other tracking devices, which has particularly proliferated during the pandemic. Such a tailored approach can actually boost workers’ productivity: less stressed by unjustified monitoring, they can bring greater creativity and motivation to their work.[157]


The strictness of the proportionality standard could also vary with the specific gig examined, as levels of vulnerability vary across gigs based on the ownership of assets, the control exercised over the work, etc.[158] Gig-work where the worker owns more resources (for instance, an Ola driver, who often owns their vehicle) would imply lesser vulnerability compared to a counterpart who does not (a cleaner from Urban Company, for example, who might own fewer resources). Italy, for instance, has a sector-specific data-protection guarantee only for gig-workers undertaking delivery services.[159]


This tentative proposal admittedly requires greater fleshing out: how exactly would the lines be drawn across sectors? How would vulnerabilities be measured across different sectors, and tailored proportionality requirements designed for each? These are essential questions requiring careful deliberation.


A.2. Algorithmic Accountability

To achieve fairness in algorithmic processing, transparency is the foremost value to be emphasised. A right, inspired by the GDPR and the 2021 Directive, to object to purely automated decisions having significant effects – along with safeguards such as access to the data and the process on which decisions are based – must be legislated. This will help workers understand how work is allotted, how performance is tracked, on what grounds terminations take place, etc. Arbitrary and discriminatory outcomes can thus be diminished, by ensuring that automated decisions affecting worker rights are rectified, as stated in art 8(3) of the 2021 Directive.[160]


The 2021 Directive should also inform the de-biased use of algorithms. Given that algorithmic processing is prone to bias, the law should require that algorithmic decisions be mediated by humans.[161] Dismissals and other major decisions taken without following PNJs, or without a ‘human-in-command’, should be deemed an ‘unfair labour practice’ punishable with a fine, similar to art 8(1) of the 2021 Directive. Besides, there should be scope for challenging such decisions before a neutral party, like a Tribunal. Challenges might involve interrogating the data points on which the algorithm based its decision to terminate, and the presence of biases in the data examined (such as in customer ratings).[162]


If a decision is substantially algorithm-induced, the burden of proving the fairness of the automated decision-making should be on the company. Failing this, the decision would be reversed, or the algorithm modified, as the Bologna Labour Court ordered Deliveroo to do.[163] This will help bridge workers’ subjective experiences at work with the objective conclusions algorithms draw solely from surveilled data, and thus bring greater autonomy to workers.[164] Lyft’s agreement with a gig-workers’ union to bear the expenses of arbitration for drivers’ complaints regarding deactivation of their ‘driver’ status, inter alia,[165] is also a good practice to adopt. The 2020 MV Guidelines should be transposed into this sector-specific law along with the safeguards highlighted above. This will bring greater worker dignity and uphold intellectual, bodily, behavioural, and associational privacy, as workers will no longer be psychologically and physically pressured while performing their duties.


A.3. Safeguards to promote collectivisation

A major problem with the prevalent surveillance of workers is how it reduces them to atomised workers without negotiating power. The gig-economy’s transformation of how work is performed also affects the scope for unionisation: dispersed gig-workers are difficult to organise via digital means, the piecemeal nature of the work reduces the scope for unionisation,[166] and the constant surveillance of gig-workers has previously been used to identify and restrict unionisation.


Under current jurisprudence, a relationship of employment must be proved to register as a trade union under the Trade Unions Act.[167] This condition is difficult for gig-workers to satisfy. In the Industrial Relations Code, a ‘worker’ for the Chapter on Trade Unions includes persons ‘employed’ in trade/industry and a ‘worker’ under the UWSSA.[168] Since gig-workers might satisfy neither requirement, their scope to unionise and demand better privacy and algorithmic fairness is reduced. The UK decision categorising certain gig-workers as ‘workers’ has accelerated demands for the unionisation of gig-workers in India, but a legal entitlement is still in the pipeline.[169]


Hence, a dignitarian conception of privacy for gig-workers must increase the scope for collectivisation. Collectivisation would increase information flows and ensure greater negotiating power which would help protect data rights.[170] An important area where unions would help in the gig economy is in managing the data processed from workers.[171] A ‘data stewardship’ model managed by a union should be provided for in the law to represent gig workers’ interests in determining what data can and cannot be shared, and in understanding how it is processed by the algorithms – for instance, how data is processed while generating worker profiles, how it is used to fire workers, etc.[172]


This approach signifies a move from individualistic data protection where individuals are assumed to be rational and informed, to community management of data by a trust-like association working in the best interests of gig-workers.[173] The stewardship model can ensure transparency and accountability in the kinds of data collected and how they are processed by algorithms.[174] Such an entitlement can help generate industry standards for each kind of gig-work, which would be backed and enforced by the State.[175]


Inspired by the collective bargaining agreements described above, such unions must also be given scope to delineate proportionality standards – for instance, a ‘burnout’ break of a certain number of days per month during which workers are not tracked or policed rigorously. They could also go beyond the entitlements the law would grant: a ‘new technology agreement’,[176] for example, could ensure that unions are consulted before any new measure is introduced, as in Norway’s example and the 2021 Directive. This would ensure that new kinds of data collection, or new algorithmic processing techniques, are all run past the negotiating union before being implemented. Involving the gig-union in design aspects would serve as a consultative[177] and preventive, rather than post-hoc, remedy.


Admittedly, the tentative suggestions made here are subject to their viability in each gig-sector, but the argument is that an entitlement to collectivisation helps increase bargaining power, reduce information asymmetries, and simultaneously provides flexibility in shaping entitlements according to the area in focus.


B. The role of social security


We noted the absence of entitlements to social security for gig-workers in current and upcoming legislation. However, guaranteeing adequate security through such laws indirectly helps protect workers’ privacy rights too – as mentioned above, labour and privacy laws work in tandem, and either can help secure the goals that both seek to promote.


For instance, when labour law confers greater protections on workers, such as improved bargaining power, their security increases. Workers in situations of economic vulnerability are coerced into consenting to disproportionate surveillance, as was especially evident during the pandemic.[178] By contrast, workers with security have greater bargaining power and have resisted workplace surveillance in the past.[179] Without such security, the fear of consequences such as discrimination or termination constrains workers from asserting their freedoms. When collectivised, they can resist, instead of caving in to surveillance practices imposed upon them in a dictatorial manner.


The ILO’s Social Protection Floors Recommendation, 2012, has also emphasised statutory social security for everyone, linking it to the need to preserve people’s dignity and to empower them in light of labour-market changes.[180] Promoting dignity and empowerment in turn achieves what privacy guarantees seek to provide; social security entitlements can thus help workers demand privacy rights with greater autonomy.


Conclusion


Through this paper, we attempted to identify the specific privacy harms – including harms to behavioural and associational privacy – caused by intrusive data collection from gig-workers and the arbitrary deployment of that data in algorithmic processing. Having observed the inadequacy of India’s labour and privacy laws in protecting against such abuses, we surveyed international jurisprudence – namely, the ILO’s Code of Practice on the Protection of Workers’ Personal Data, the GDPR and the cases decided under it, and collective bargaining agreements. These instruments stress proportionality in data processing, rights against opaque and purely automated algorithmic processing, and the importance of collectivisation.


Hence, we utilised these insights to recommend a legislative framework specifically for the gig-sector in India, emphasising a right to privacy subject to strict proportionality, algorithmic fairness with a right to human-mediated decisions, and entitlements to collectivisation in the form of community stewardship of data. We also briefly examined how an entitlement to social security can indirectly achieve the same goals as the proposed framework.

Technological advances will only further blur the distinction between the personal and the professional, and labour law will have to adapt to such upheavals of the terrain on which it is premised. Consequently, labour and privacy law, which uphold common values, must work together to empower gig-workers and uphold their dignity.

[1] Vili Lehdonvirta, ‘Where are online workers located? The international division of digital gig work’ (Oxford Internet Institute, 11 July 2017) <https://www.oii.ox.ac.uk/news-events/news/where-are-online-workers-located-the-international-division-of-digital-gig-work/> accessed 9 December 2021. [2] Saloni Atal, ‘Towards a Gender Equal Future of Work for Women: A Preliminary Case Study of Women in the Gig Economy in India During COVID-19’ (2020) Tandem Research Issue Brief 5 <https://tandemresearch.org/assets/Women-Platform-TR-2020-5.pdf> accessed 9 December 2021. [3] Judith A. Chevalier, ‘Gig Workers Value Their Flexibility…a Lot’ (Yale Insights, 16 April 2019) <https://insights.som.yale.edu/insights/gig-workers-value-their-flexibility-lot> accessed 4 July 2022. [4] Surveillance capitalism is an exploitative economic activity that profits from the processing of behavioural data, which instrumentalises the humans whose data it collects, for profit-oriented ends. See Shoshana Zuboff, The Age of Surveillance Capitalism (Profile Books 2019). [5] ‘Need for Surveillance Reform Stronger Than Ever in Light of the Draft Data Protection Bill, 2021’ (Internet Freedom Foundation, 21 December 2021) <https://internetfreedom.in/surveillance-reform-pdpb/> accessed 10 November 2022; Anamika Kundu and Digvijay Chaudhary, ‘CCTVs in Public Spaces and the Data Protection Bill, 2021’ (Centre for Internet and Society, 20 April 2022) <https://cis-india.org/internet-governance/blog/rssr-anamika-kundu-digvijay-s-chaudhary-april-20-2022-cctvs-in-public-spaces-and-data-protection-bill-2021> accessed 10 November 2022; Apar Gupta and Vrinda Bhandari, ‘National Security, at the Cost of Citizens’ Privacy’ (The Indian Express, 20 December 2021) <https://indianexpress.com/article/opinion/columns/national-security-at-the-cost-of-citizens-privacy-7680787/> accessed 10 November 2022. [6] Bert-Jaap Koops, Bryce Newell, Tjerk Timan, Ivan Škorvánek et al, ‘A Typology of Privacy’ (2017) 38(2) Univ of Penn J. of Intl L 483, 564-569. [7] Ibid 500, 501. [8] ‘Partner Terms’, ¶5C (Dunzo) <https://www.dunzo.com/terms#partner_terms> accessed 21 November 2021. [9] Anweshaa Ghosh, ‘Women Workers in the Gig Economy in India: An Exploratory Study’ (2020) Institute of Social Studies Trust <https://www.isstindia.org/publications/1619503999_pub_GIG_Report_Final_-_Low_Res_compressed.pdf> accessed 9 December 2021. [10] Pallavi Bansal, ‘Platform drivers: From algorithmizing humans to humanizing algorithms’ (Femlab, 2 October 2020) <https://femlab.co/2020/10/02/platform-drivers-from-algorithmizing-humans-to-humanizing-algorithms/> accessed 9 December 2021. [11] Shiv Sunny, ‘Meet Delivery Bhoy: The man shaking up India’s booming gig economy’ (Hindustan Times, 2 September 2021) <https://www.hindustantimes.com/analysis/meet-delivery-bhoy-the-man-shaking-up-india-s-booming-gig-economy-101630561630447.html> accessed 9 December 2021. [12] Nayantara Ranganathan, ‘Caution! Women at Work: Surveillance in Garments Factories’ (Gendering Surveillance, February 2017) <https://genderingsurveillance.internetdemocracy.in/cctv/> accessed 9 December 2021. [13] ‘#PrivacyOfThePeople: Gig and app-based workers’ (Internet Freedom Foundation, 20 September 2021) <https://internetfreedom.in/privacyofthepeople-gig-and-app-based-workers/> accessed 9 December 2021.
[14] Alex Rosenblat and Luke Stark, ‘Uber’s Drivers: Information Asymmetries and Control in Dynamic Work’ (Centre for European Policy Studies, Brussels November 2015); ‘Case Study: The Gig Economy and Exploitation’ (Privacy International, 30 August 2017) <https://privacyinternational.org/case-study/751/case-study-gig-economy-and-exploitation> accessed 9 December 2021. [15] K Mohamed Sheriff, ‘Big Data Revolution: Is It a Business Disruption?’ in Lotfi Tadj and Ajay K. Garg (eds), In Emerging Challenges in Business, Optimization, Technology, and Industry (Springer International Publishing 2018). [16] ‘World Employment and Social Outlook: The role of digital labour platforms in transforming the world of work’ (2021) ILO Flagship Report <https://www.ilo.org/wcmsp5/groups/public/---dgreports/---dcomm/---publ/documents/publication/wcms_771749.pdf> accessed 9 December 2021 (‘ILO Report’); ‘Uber Turns Passive Data Into Active Earnings’ (Tech Portfolio, 26 October 2016) <http://techportfolio.net/2016/12/uber-turns-passive-data-into-active-earnings/#ixzz7EBo4y9yy> accessed 9 December 2021. [17] Bama Athreya, ‘Slaves to Technology: Worker control in the surveillance economy’ (2020) 15 Anti-Trafficking Review <https://www.antitraffickingreview.org/index.php/atrjournal/article/view/490> accessed 9 December 2021. [18] Rosenblat and Stark (n 14). [19] ILO Report (n 16). [20] N. A. Moreham, ‘Beyond Information: Physical Privacy in English Law’ (2014) 73(2) Cambridge LJ 350. [21] Eleni Frantziou. ‘The right to privacy while working from home (‘WFH’): why employee monitoring infringes Art 8 ECHR’ (UK Labour Law, 5 October 2020) <https://uklabourlawblog.com/2020/10/05/the-right-to-privacy-while-working-from-home-wfh-why-employee-monitoring-infringes-art-8-echr-by-eleni-frantziou/> accessed 9 December 2021. [22] Jeevan Hariharan and Hadassa Noord, ‘Employee Monitoring as a Form of Imprisonment’ (UK Labour Law Blog, 19 May 2021) <https://uklabourlawblog.com/2021/05/19/employee-monitoring-as-a-form-of-imprisonment-jeevan-hariharan-and-hadassa-noorda/> accessed 9 December 2021. [23] Julie Cohen, ‘What Privacy is For’ (2013) 126(7) Harv L Rev 1904 <https://harvardlawreview.org/2013/05/what-privacy-is-for/> accessed 9 December 2021. [24] Internet Freedom Foundation (n 13). [25] Funda Ustek-Spilda, Alessio Bertolini, et al, ‘COVID-19, the gig economy and the hunger for surveillance’ (Ada Lovelace Institute, 8 December 2020) <https://www.adalovelaceinstitute.org/blog/covid-19-gig-economy-hunger-for-surveillance/> accessed 9 December 2021. [26] NDTV, ‘Zomato, Swiggy, Urban Company add vaccination certificate to gain trust’ (Twitter, 9 June 2021) <https://twitter.com/ndtv/status/1402689925980459008?lang=en> accessed 9 December 2021. [27] Aroon Deep, ‘Gig economy workers’ collective questions use of Aarogya Setu’ (Medianama, 12 June 2020) <https://www.medianama.com/2020/06/223-ifat-aarogya-setu-gig-economy/ > accessed 9 December 2021. [28] Aditi Agrawal, ‘Gig Economy Workers’ Collective Questions Use of Aarogya Setu’ (Medianama, 12 June 2020) <https://www.medianama.com/2020/06/223-ifat-aarogya-setu-gig-economy/> accessed 25 November 2021. [29] Ibid. [30] ‘Locking Down the Impact of COVID-19’ (2020) Center for Internet and Society Report <https://cis-india.org/raw/ifat-itf-locking-down-the-impact-of-covid-19-report> accessed 9 December 2021. [31] Deep (n 27). 
[32] Mareike Möhlmann and Lior Zalmanson, ‘Hands on the wheel: Navigating algorithmic management and Uber drivers’ autonomy’ (38th International Conference on Information Systems, Seoul, 2017) <https://www.semanticscholar.org/paper/Hands-on-the-Wheel%3A-Navigating-Algorithmic-and-Uber-Moehlmann-Zalmanson/70aa7c8eaacf7bf802ec211123281364cbaf528d> accessed 25 November 2021. [33] Bansal (n 10). [34] Mareike Möhlmann & Ola Henfridsson, ‘What People Hate About Being Managed by Algorithms, According to a Study of Uber Drivers’ (Harvard Business Review, 30 August 2019) <https://hbr.org/2019/08/what-people-hate-about-being-managed-by-algorithms-according-to-a-study-of-uber-drivers> accessed 9 December 2021. [35] Ibid. [36] Hatim Rahman, ‘From Iron Cages to Invisible Cages: Algorithmic Evaluations in Online Labour Markets’ (2018) Stanford University Working Paper <https://journals.sagepub.com/doi/abs/10.1177/00018392211010118> accessed 9 December 2021; ILO Report (n 16). [37] ILO Report (n 16). [38] Noopur Raval, ‘Automating informality: On AI and labour in the global South’ (Global Information Society Watch, 2019) <https://giswatch.org/node/6202> accessed 9 December 2021. [39] Ghosh (n 9). [40] ILO Report (n 16). [41] Koops (n 6) 567. [42] Wanda J. Orlikowski and Susan V. Scott, ‘What happens when evaluation goes online? Exploring apparatuses of valuation in the travel sector’ (2014) 25(3) Organization Science 868-891. [43] Alex Rosenblat, Karen E.C. Levy, Solon Barocas, et al, ‘Discriminating Tastes: Uber's Customer Ratings as Vehicles for Workplace Discrimination’ (2017) 9(3) Policy & Internet 256-279. [44] Shreya Raman and Rizvi Saif, ‘Gig Jobs Give Women Higher Incomes But Little Security’ (IndiaSpend, 11 January 2021) <https://www.indiaspend.com/women-2/gig-jobs-give-women-higher-incomes-but-little-security-711758> accessed 9 December 2021. [45] Athreya (n 17). [46] Rosenblat and Stark (n 14). [47] Karen Levy & Solon Barocas ‘Designing against discrimination in online markets’ (2017) 32 Berkeley Tech LJ 1184. [48] Alex Rosenblat and Luke Stark, ‘Algorithmic labor and information asymmetries: A case study of Uber’s drivers’ (2016) 10 International Journal of Communication 3758-3784. [49] Danielle Keats Citron and Frank Pasquale, ‘The Scored Society: Due Process for Automated Predictions’ (2014) 89 Washington Law Review <https://digitalcommons.law.uw.edu/wlr/vol89/iss1/2/> accessed 9 December 2021. [50] Soumyarendra Barik, ‘When algorithms dictate your work: Life as a food delivery ‘partner’’ (Entrackr, 20 August 2021) <https://entrackr.com/2021/08/zomato-when-algorithms-dictate-your-work-life-as-a-food-delivery-partner/> accessed 9 December 2021. [51] Alexandra Mateescu and Aiha Nguyen, ‘Workplace Monitoring and Surveillance’ (2019) Data and Society Report <https://datasociety.net/wp-content/uploads/2019/02/DS_Workplace_Monitoring_Surveillance_Explainer.pdf> accessed 9 December 2021. [52] Rahman (n 36). [53] Valerio De Stefano, ‘“Negotiating the algorithm”: Automation, artificial intelligence and labour protection’ (2018) International Labour Office Employment Working Paper No. 246 <https://www.ilo.org/wcmsp5/groups/public/---ed_emp/--mp_policy/documents/publication/wcms_634157.pdf> accessed 9 December 2021. [54] Laurie Clarke, ‘Algorithmic bosses are moving beyond the gig economy’ (TechMonitor, 19 May 2021) <https://techmonitor.ai/leadership/workforce/algorithmic-bosses-changing-work> accessed 9 December 2021. 
[55] Alex Rosenblat and Luke Stark, ‘Uber’s Drivers: Information Asymmetries and Control in Dynamic Work’, Workshop Paper prepared for the Winter School “Labour in the on-demand economy” at the Centre for European Policy Studies (CEPS) in Brussels, Belgium, November 23-25, 2015. [56] Kirstie Ball, ‘Workplace surveillance: an overview’ (2010) 51(1) Labor History 87. [57] Athreya (n 17). [58] Gemma Newlands, ‘Algorithmic Surveillance in the Gig Economy: The Organization of Work through Lefebvrian Conceived Space’ (2020) 42(5) Organisation Studies 719. [59] Ghosh (n 9). [60] Guy Davidov, ‘The (Changing) Idea of Labour Law’ (2007) 146 Int'l Lab. Rev. 311, 312. [61] Ibid 313-316. [62] Woodrow Hartzog and Neil Richards, ‘Taking Trust Seriously in Privacy Law’ (2016) Stanford Technology Law Review 431, 447-456. [63] Woodrow Hartzog, Privacy’s Blueprint: The Battle to Control the Design of New Technologies (Harvard University Press 2018) 104-106. [64] Artur Rycak, ‘New technologies and the employee’s right to privacy’ in Jo Carby-Hall and Lourdes Mella Méndez (eds), Labour Law and the Gig Economy: Challenges Posed by the Digitalisation of Labour Processes (Routledge 2020) 171, 173. [65] KS Puttaswamy (Aadhaar-5J) v. Union of India (2019) 1 SCC 1 [511.9]. [66] ‘Protection of Workers’ Personal Data’ (International Labour Organisation, 1997) 8 <https://www.ilo.org/global/topics/safety-and-health-at-work/normative-instruments/code-of-practice/WCMS_107797/lang--en/index.htm> accessed 22 November 2021 (‘ILO Code of Practice’). [67] Robert Booth, ‘Uber drivers to launch legal bid to uncover app's algorithm’ (The Guardian, 20 July 2020) <https://www.theguardian.com/technology/2020/jul/20/uber-drivers-to-launch-legal-bid-to-uncover-apps-algorithm> accessed 7 December 2021. [68] Rakhi Jindal et al, ‘The Indian legal position on employee data protection and employee privacy’ (NDA, March 2012) 47. [69] Ibid. [70] Employees’ Compensation Act, 1923, s 2(dd); Employees’ State Insurance Act, 1948, ss 1(4) and 1(5). [71] In fact, a representative association of app-based delivery and transport workers has filed a writ before the Supreme Court seeking to categorise them as ‘unorganised workers’ under this Act and thus provide them social security benefits. ‘Gig Workers’ Access to Social Security: The Indian Federation Of App Based Transport Workers (IFAT) v. Union of India’ (Supreme Court Observer, 1 December 2021) <https://www.scobserver.in/cases/gig-workers-access-to-social-security-the-indian-federation-of-app-based-transport-workers-ifat-v-union-of-india/> accessed 5 December 2021. [72] Code on Social Security, 2020, s 113(1)(b). [73] Ibid s 114. [74] ‘Rework social security code for informal workers’ (The Hindu Business Line, 19 May 2021) <https://www.thehindubusinessline.com/opinion/rework-social-security-code-for-informal-workers/article34599616.ece> accessed 3 December 2021. [75] Arghya Sengupta, ‘The Data Protection Bill, 2021: It’s No Longer Personal’ (Vidhi Centre for Legal Policy, 22 December 2021) <https://vidhilegalpolicy.in/blog/the-data-protection-bill-2021-its-no-longer-personal/> accessed 20 February 2022. [76] The Information Technology Act, 2000, s 72A. [77] Ibid s 43A; Jindal (n 68) 48. [78] The Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011, rule 3. [79] Ibid rules 4 and 5. [80] Ibid rules 5(6) and 5(7). [81] Satish N. v. State of Karnataka (2016) SCC OnLine Kar 6542 [226-228]. [82] Neera Mathur (Mrs) v.
LIC (1992) 1 SCC 286. [83] Surabhi Agarwal, ‘Fresh Legislation may replace Data Protection Bill’ (Economic Times, 17 February 2022) <https://economictimes.indiatimes.com/tech/technology/fresh-legislation-may-replace-data-protection-bill/articleshow/89624369.cms> accessed 22 February 2022. [84] Data Protection Bill 2021, clause 3(28). [85] Ibid clauses 5-10. [86] Ibid clause 13(1)(a). [87] Internet Freedom Foundation (n 13). [88] ‘Report of the Joint Committee on the Personal Data Protection Bill, 2019’ (Lok Sabha Secretariat, December 2021) 66 <https://drive.google.com/file/d/1emcAB8HjE2oCC_DI6zR5YPnPQ5iwwwCT/view?usp=sharing> accessed 15 January 2022 (‘JPC Report’). [89] Data Protection Bill 2021, clauses 14(1) and 14(2). [90] Internet Freedom Foundation (n 13). [91] Data Protection Bill 2021, clause 5. [92] Ibid clause 17 and 19. [93] Ibid clause 18(3). [94] Ibid clause 21(4). [95] Ibid clause 3(28). [96] JPC Report (n 88) 25-26. [97] ‘Unconstitutional draft report on non-personal data ignores concerns about privacy and data monopolies’ (Internet Freedom Foundation, 18 January 2021) <https://internetfreedom.in/unconstitutional-draft-report-on-non-personal-data-ignores-concerns-about-privacy-and-data-monopolies/> accessed 1 December 2021. [98] ‘The non-personal data policy process remains opaque and problematic #SaveOurPrivacy’ (Internet Freedom Foundation, 23 December 2020) <https://internetfreedom.in/the-non-personal-data-policy-process-remains-opaque-and-problematic-saveourprivacy/> accessed 1 December 2021. [99] ‘Draft Report by the Committee of Experts on Non-Personal Data Governance Framework: Version 2’ (Committee of Experts, 16 December 2020) 47 <https://static.mygov.in/rest/s3fs-public/mygov_160975438978977151.pdf> accessed 20 February 2022. [100] Rakesh Kumar, ‘India needs to bring an algorithm transparency bill to combat bias’ (Observer Research Foundation, 9 September 2019) <https://www.orfonline.org/expert-speak/india-needs-to-bring-an-algorithm-transparency-bill-to-combat-bias-55253/> accessed 30 November 2021. [101] ‘Motor Vehicle Aggregator Guidelines issued to regulate shared mobility and reducing traffic congestion and pollution’ (Press Information Bureau, 27 November 2020) <https://www.pib.gov.in/Pressreleaseshare.aspx?PRID=1676403> accessed 20 February 2022. [102] The Motor Vehicle Aggregators Guidelines 2020, clause 9(6). See also Nandini Chami and Sadhana Sanjay, ‘A data rights agenda for platform and gig economy workers’ Hindustan Times (23 December 2020) <https://www.hindustantimes.com/analysis/a-data-rights-agenda-for-platform-and-gig-economy-workers/story-9ZtpX9kavVSN5TNQcE289M.html> accessed 28 November 2021. [103] Industrial Disputes Act 1947, s 25T; The Industrial Relations Code, 2020, s 84 (‘IR Code’). [104] Ibid s 2(ra), IR Code, s 2(zo). [105] Ibid Fifth Schedule, entry 5(a); IR Code, Second Schedule, entry 5(f). [106] Satish Ganesh Saphtarshi & Ors. v. Kirloskar Oil Engines Ltd. & Anr. (1995) SCC OnLine Bom 20 [17-18]. [107] ILO Code of Practice (n 66). [108] Ibid 18. [109] Ibid 14. [110] General Data Protection Regulation [2016] OJ L119/1, art 88 (‘GDPR’). [111] Patrick Eecke and Anrijs Simkus, ‘Article 88: Processing in the context of employment’ in Christopher Kuner et al (eds), The EU General Data Protection Regulation: A Commentary (OUP 2020) 1235-1236. [112] GDPR, art 22. [113] Ibid art 22(3) and recital 71. 
[114] ‘Dutch & UK courts order Uber to reinstate ‘robo-fired’ drivers’ (Worker Information Exchange, 14 April 2021) <https://www.workerinfoexchange.org/post/dutch-uk-courts-order-uber-to-reinstate-robo-fired-drivers> accessed 3 December 2021. [115] Applicant 1 & Ors. v. Uber BV, Case No. C/13/696010/HA.ZA.21-81, dated 19 February 2021 [3.1-3.4]. [116] Applicant 1 & Ors. v. Ola Netherlands BV, Case No. C/13/689705/HA.RK.20-258, dated 11 March 2021 [4.23-4.25] (‘Ola’). [117] In accordance with GDPR, art 15(1)(h). [118] Ola (n 116) [4.45]. [119] Ibid [4.47]. [120] Ibid [4.49]. [121] Applicant 1 & Ors. v. Uber BV, Case No. C/13/687315/HA.RK 20-207, dated 11 March 2021 (‘Uber1’). [122] Ibid [4.25]; ‘Gig workers score historic digital rights victory against Uber and Ola Cabs’ (Worker Information Exchange, 16 March 2021) <https://www.workerinfoexchange.org/post/gig-workers-score-historic-digital-rights-victory-against-uber-ola-2> accessed 3 December 2021. [123] Ibid [4.53-4.55]. [124] Ibid [4.38, 4.41]. [125] Ibid [4.45-4.51]. [126] Applicant 1 & Ors. v. Uber BV, Case No. C/13/692003/HA.RK 20-302, dated 11 March 2021 [4.24] (‘Uber2’). [127] Ibid [4.67-4.68]. [128] GDPR, art 15. [129] Commission, ‘Proposal for a Directive of the European Parliament and of the Council on improving working Conditions in Platform Work’ (Proposal) COM(2021) 762 final, 9 December 2021 (‘2021 Directive’). [130] 2021 Directive, arts 4(1) and 4(2). [131] 2021 Directive, arts 6(1) and 6(2). [132] 2021 Directive, art 6(4). [133] 2021 Directive, art 6(5). [134] 2021 Directive, arts 7(1), 7(2) and 7(3). [135] 2021 Directive, art 8(1). [136] 2021 Directive, art 8(2). [137] 2021 Directive, art 9(1). [138] Phoebe Moore et al, ‘Digitalisation of Work and Resistance’, in Martin Upchurch et al (eds), Humans and Machines at Work (Palgrave Macmillan 2018) 17, 33-34. [139] Ibid. [140] ‘Collective agreement Between Hilfr ApS. and 3F Private Service, Hotel and Restaurant’ (Danish Confederation of Trade Unions, 2018) <https://old.adapt.it/adapt-indice-a-z/wp-content/uploads/2020/10/Hilfr-3F-collective-agreement-2018.pdf> accessed 7 December 2021. [141] Janine Berg and Valerio De Stefano, ‘How are workers faring in the gig economy?’ (ILO, 20 May 2016) <https://iloblog.org/2016/05/20/how-do-workers-fare-in-the-gig-economy/> accessed 4 December 2021. [142] Ifeoma Ajunwa et al, ‘Limitless Worker Surveillance’ (2016) 105 California Law Review 735, 762. [143] Joel Reidenberg, Jaspreet Bhatia, Travis Breaux, and Thomas Norton, ‘Ambiguity in Privacy Policies and the Impact of Regulation’ (2016) 45(2) JLS <https://www.journals.uchicago.edu/doi/abs/10.1086/688669?journalCode=jls> accessed 18 July 2022. [144] Rahul Matthan, ‘Beyond Consent – A New Paradigm for Data Protection’ (2017) Takshashila Discussion Document 3/2017, 2-3 <https://static1.squarespace.com/static/618a55c4cb03246776b68559/t/62a6beb7e9a41c52669d188b/1655094971332/TDD-Beyond-Consent-Data-Protection-RM-2017-03.pdf> accessed 18 July 2022; Joel Reidenberg, Stanley D, et al, ‘Privacy Harms and the Effectiveness of Notice and Choice Consent Framework’ (2014) 11(2) Journal of Law and Policy for the Information Society 485, 490-496. [145] Ibid. [146] ILO Code of Practice (n 66) 8. [147] Ibid. [148] Divij Joshi, ‘Privacy Theory 101: Privacy as Contextual Integrity’ (CLPR, 17 September 2020) <https://clpr.org.in/blog/privacy-theory-101-privacy-as-contextual-integrity/> accessed 2 December 2021.
[149] ‘Labour Law Must Recognise Platform Workers' Rights’ (IT For Change, September 2020) <https://itforchange.net/labour-law-platform-workers-rights-data-digital-economy> accessed 13 November 2021 (‘ITFC’). [150] ‘Delivery Partner Terms and Conditions’ (Runnr) <https://www.runnr.in/delivery-partner-tandc.html> accessed 28 November 2021. [151] Sam Adler-Bell and Michelle Miller, ‘The Datafication of Employment’ (The Century Foundation, 19 December 2018) 15 <https://tcf.org/content/report/datafication-employment-surveillance-capitalism-shaping-workers-futures-without-knowledge/?agreed=1> accessed 2 December 2021 (‘Datafication’); Tatum Millet, ‘Privacy or Paycheck: Protecting Workers Against Surveillance’ (Digital Freedom Fund, 3 August 2020) <https://digitalfreedomfund.org/privacy-or-paycheck-protecting-workers-against-surveillance/> accessed 28 November 2021. [152] Julia Carrie, ‘Uber's secret Hell program violated drivers' privacy, class-action suit claims’ (The Guardian, 25 April 2017) <https://www.theguardian.com/technology/2017/apr/24/uber-hell-program-driver-privacy-lyft-spying> accessed 27 November 2021. [153] ITFC (n 149). [154] Wilfred Chan, ‘The Workers Who Sued Uber and Won’ (Dissent, 5 May 2021) <https://www.dissentmagazine.org/online_articles/the-workers-who-sued-uber-and-won> accessed 2 December 2021. [155] Ball (n 56) 88. [156] De Stefano (n 53) 8-9. [157] Ethan Bernstein, ‘The Transparency Trap’ (Harvard Business Review, October 2014) <https://hbr.org/2014/10/the-transparency-trap> accessed 3 December 2021; Ball (n 56) 93-94. [158] Katherine C. Kellogg et al, ‘Algorithms at Work: The New Contested Terrain of Control’ (2020) 14(1) Academy of Management Annals 366, 383. [159] CMS Legal, ‘Gig working, platform companies and the future: a global perspective from CMS Employment Lawyers in 15 countries’ (Lexology, 2021) <https://www.lexology.com/library/detail.aspx?g=8f9530c3-067d-42de-8660-0efd93f97add> accessed 28 November 2021. [160] Adler-Bell and Miller (n 151) 16. [161] Philippa Collins, ‘Automated Dismissal Decisions, Data Protection and The Law of Unfair Dismissal’ (The UK Labour Law Blog, 19 October 2021) <https://uklabourlawblog.com/2021/10/19/automated-dismissal-decisions-data-protection-and-the-law-of-unfair-dismissal-by-philippa-collins/> accessed 14 November 2021. [162] Ibid. [163] ‘Italy: Bologna Labour Court held a previously used algorithm of a platform company as discriminatory’ (Industrial Relations and Labour Law, February 2021) <https://ioewec.newsletter.ioe-emp.org/industrial-relations-and-labour-law-february-2021/news/article/italy-bologna-labour-court-held-a-previously-used-algorithm-of-a-platform-company-as-discriminatory> accessed 3 December 2021. [164] Newlands (n 58) 720-725. [165] Antonio Aloisi, ‘Commoditized Workers: Case Study Research On Labor Law Issues Arising From A Set Of “On-Demand/Gig Economy” Platforms’ (2016) 37 Comp. Labour Law and Policy Journal 653, 685. [166] Zoe Tabary and Avi Asher-Schapiro, ‘Tech experts voice concerns of gig worker surveillance in pandemic’ (Thomson Reuters, 11 November 2020) <https://news.trust.org/item/20201111184721-5352k> accessed 28 November 2021. [167] Tirumala Tirupati Devasthanam v. Commissioner of Labour & Ors. 1995 Supp (3) SCC 653. [168] IR Code, 2020, proviso to s 2(zr).
[169] Soumya Jha and Ulka Bhattacharya, ‘How trade unions recharged for the gig economy’ (The Hindu Business Line, 28 October 2021) <https://www.thehindubusinessline.com/opinion/how-trade-unions-recharged-for-the-gig-economy/article37210395.ece> accessed 6 December 2021. [170] Sanjana Varghese, ‘Gig economy workers have a new weapon in the fight against Uber’ (Wired, 17 February 2020) <https://www.wired.co.uk/article/gig-economy-uber-unions> accessed 28 November 2021. [171] S.P. Choudary, ‘The Architecture of Digital Labour Platforms: Policy Recommendations on Platform Design for Worker Well-being’ (2018) ILO Future of Work Research Paper Series, No. 3, 31-32 <https://www.ilo.org/wcmsp5/groups/public/---dgreports/---cabinet/documents/publication/wcms_630603.pdf> accessed 1 December 2021. [172] Astha Kapoor, ‘Collective bargaining on digital platforms and data stewardship’ (February 2021) 6 <http://library.fes.de/pdf-files/bueros/singapur/17381.pdf> accessed 27 November 2021. [173] Athreya (n 17) 95. [174] ILO Code of Practice (n 66) 24. [175] Josh Eidelson and Benjamin Penn, ‘Labor, Gig Companies Near Bargaining Deal in N.Y.’ (Bloomberg, 18 May 2021) <https://www.bloomberg.com/news/articles/2021-05-18/labor-gig-companies-are-said-to-be-near-bargaining-deal-in-n-y> accessed 6 December 2021. [176] Clarke (n 54). [177] De Stefano (n 53) 95. [178] Titiksha Vashist and Shyam Krishnakumar, ‘COVID-19 and the New Normal in India’s Gig Economy’ (Datactive, 19 March 2021) <https://data-activism.net/2021/03/bigdatasur-covid-covid-19-and-the-new-normal-in-indias-gig-economy/> accessed 30 November 2021. [179] Ben Quinn and Jasper Jackson, ‘Daily Telegraph to withdraw devices monitoring time at desk after criticism’ (The Guardian, 11 January 2016) <https://www.theguardian.com/media/2016/jan/11/daily-telegraph-to-withdraw-devices-monitoring-time-at-desk-after-criticism> accessed 5 December 2021. [180] Janine Berg, ‘Precarious workers pushed to the edge by COVID-19’ (ILO, 20 March 2020) <https://iloblog.org/2020/03/20/precarious-workers-pushed-to-the-edge-by-covid-19/> accessed 1 December 2021.
