This article critically examines how the European General Data Protection Regulation (GDPR) defines and applies the legal categories of "data subject" and "data controller" within employment relationships. Under the GDPR, employers who collect and process personal data are classified as data controllers, while employees are designated as data subjects. However, this article argues that such a "translation" of data protection categories, rights, and obligations into the regulation of workplace dynamics requires closer examination. The focus is on evaluating whether the GDPR's rights and obligations for data subjects and data controllers accurately capture the agency and interests of employee and employer roles in the workplace. Through an analysis of data subject rights (GDPR, Articles 12-22) and selected data controller obligations (GDPR, Articles 24, 25, and 35), this article identifies a "Lost in Translation" effect, where the GDPR's assumptions of autonomy and agency conflict with the hierarchical nature of employment relationships and with the information asymmetries between employers and data processors (GDPR, Article 28). Ultimately, this analysis highlights the need to reconsider and adapt these legal categories to better reflect the unique characteristics of employment in the regulation of data processing.
Keywords: Labour Law Protection, Data Protection, Privacy, GDPR
I. INTRODUCTION: THE LEGAL CATEGORIES OF EUROPEAN DATA PROTECTION AND LABOUR LAW
II. EMPLOYEES AS DATA SUBJECTS
A. Data Subjects and Their Rights
B. Who Is the Data Subject?
C. Lost in Translation: Consumer Protection Disguised as Labour Law Protection
III. EMPLOYERS AS DATA CONTROLLERS
A. Employers as Consumers of Data Processing Tools
B. Data Controllers and Processors
C. Lost in Translation: Employers Disguised as Creators of Data Processing
IV. IN SEARCH OF A PROPER TRANSLATION: ADDRESSING AGENCY GAPS
V. CONCLUSION
VI. REFERENCES
I. INTRODUCTION: THE LEGAL CATEGORIES OF EUROPEAN DATA PROTECTION AND LABOUR LAW
RESEARCH INTERSECTING LABOUR LAW and data protection has fascinated scholars for decades (Finkin, 1995; Simitis, 1986). New technologies bring new organisational patterns in which collecting and processing employees' personal data is integrated into the company's operations. The employer's right to monitor has been closely scrutinised by European courts and scholars, since it shows signs of hypertrophy: monitoring and managing employees has become easier thanks to the increasing availability of digital tools and to more intense, constant, and precise data collection at work (Molè & Mangan, 2023).1
When the early surveillance tools took hold in workplaces, it became apparent that the collection of data from employees, and then its processing through automated systems, had to be regulated. As early as 1987, Spiros Simitis noted that "The pressure resulting from employees' need to retain their jobs and from the presence of the information system inhibits critical reactions. The employees tend, instead, to conform to the real or assumed expectations of the employer" (Simitis, 1987, p. 723). Simitis' argument well represents the stance of the first European scholars dealing with automated data processing in work contexts: not applying data protection regulations at work would have been tantamount to carving out an exception to labour law protection, i.e., to its goals of regulating employer authority, reducing contractual inequalities, and safeguarding employees' agency in subordinate employment (Hendrickx, 2022, pp. 9-10).
This new regulatory challenge was first taken up in the European context in 1989 by the Council of Europe (CoE) Recommendation No. R(89)2 on the Protection of Personal Data Used for Employment Purposes. There, the CoE states that data collection at work should be guided "by principles which are designed to minimise any risks which such methods could possibly pose for the rights and fundamental freedoms of employees, in particular their right to privacy" (Council of Europe, 1989, p. 1). To this day, however, the CoE Recommendation advocating for a framework informed by the power imbalances of the employment relationship remains a dead letter. As thoughtfully reconstructed by Halefom Abraha, the European Union (EU) has, since the 1990s, failed in at least three instances to implement such a framework (Abraha, 2022, pp. 280-282).
Under European legislation, employers who process employees' data are consistently subject to general data protection law. Since the 1990s, this has been reflected in Data Protection Directive (DPD) 95/46/EC and, today, in EU Regulation 2016/679 (General Data Protection Regulation, hereafter GDPR). An employer who collects and processes personal data is regulated as "data controller" under the GDPR,2 whereas the employee identified by the personal data is a "data subject."3
The GDPR allows Member States to adopt more specific laws or collective agreements to protect employees' rights and freedoms.4 Additionally, the Article 29 Data Protection Working Party (Art. 29 WP) provided practical guidelines in its 2001 and 2017 opinions, interpreting relevant DPD and GDPR principles to offer meaningful protection to employees for data processing at work (Article 29 Data Protection Working Party, 2001, 2017b). However, recent scholarship continues to call for a more tailored regulatory framework (Adams-Prassl et al., 2023; Albin, 2025). The versatility of digital technologies, combined with a competitive market driving the development and sale of innovative products to employers, has resulted in increasingly intrusive data collection practices in workplaces (Negrón, 2021). Part of the literature looks beyond the existing GDPR framework, examining how the EU Artificial Intelligence (AI) Regulation5 and Platform Work Directive (PWD)6 could better address the growing concerns over employer authority in the digital age (Cristofolini, 2024; Potocka-Sionek & Aloisi, 2025).
In this article, my objective is not to discuss new strategies for regulating data protection at work. Instead, I examine the GDPR in greater detail to assess whether its apparatus of norms and rules achieves its intended goal of protection in employment relationships. I carry out a subjectivity check: what are the features and protections afforded to the data subject? What are the responsibilities and obligations for the data controller? Do these align with the roles and duties of employees and employers in employment relationships? In other words, I aim to understand whether the GDPR's regulation of the power relation between the data subject and data controller reflects that of employees and employers in employment relationships. In this sense, this article provides a critical analysis of whether and why it is necessary to move beyond this framework. It will thus not delve - primarily for reasons of space and argumentative coherence - into specific recommendations or alternative policies on how this objective should be achieved. It will rather expose the gaps and shortcomings that a general data regulation like the GDPR brings with it when applied in workplace contexts.
In Section II ("Employees as Data Subjects") and Section III ("Employers as Data Controllers"), I explore how applying the data subject-data controller framework to employees and employers, in some instances, might create a "Lost in Translation" effect. I examine the rights and obligations designed for average data subjects7 and average data controllers,8 focusing on their understanding of individual agency and accountability. More specifically, in Section II, I highlight how the GDPR's protections for average data subjects are primarily designed for identifiable individuals and consumers, and, in rare and insufficient situations, for more subordinated individuals in need of protection. In Section III, I analyse data controllers' obligations to comply with the GDPR and examine whether they enable employers to understand the functionalities and risks of the technologies they purchase from providers. Ultimately, my aim is to identify any potentially harmful "Lost in Translation" effect, where the application of the GDPR to the employment relationship may fall short of achieving its regulatory goals (Section IV, "In Search of a Proper Translation: Addressing Agency Gaps"). Finally, Section V ("Conclusion") offers a few concluding remarks.
II. EMPLOYEES AS DATA SUBJECTS
The data subject is defined in Article 4(1) of the GDPR as any
natural person [...] who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person.
Essentially, a data subject is any natural person who can be identified, directly or indirectly, by information available to the data controller, whose aim is to perform "operations" on such information.9 The GDPR thus regulates this power relationship by establishing legal grounds for data processing, principles governing such processing, and prohibitions.10 Consequently, it is widely accepted in legal literature that an employee, whose personal data is processed by their employer, qualifies as a data subject due to the data processing activities carried out by the employer (Abraha, 2023; Aloisi & Gramano, 2019).
However, the employee's status hinges on a peculiar type of power relation. Despite the lack of a European definition of "employment" - partially addressed by recent advancements by the PWD - subordination remains the predominant feature of the employee's status (Aloisi & De Stefano, 2024; Risak & Dullinger, 2018). The Court of Justice of the European Union (CJEU) in the Lawrie-Blum case stressed that the essential core of an employment relationship is "that for a certain period of time a person performs services for and under the direction of another person."11 Without going further into the longstanding debate on employment status, these references are useful in highlighting an element perhaps obvious yet decisive: the employee's status is defined by a condition of subordination and economic dependency. Employers exercise managerial, monitoring, and disciplinary authority over the workforce, enabling them to impose temporal and spatial discipline on employees. Labour law protection is, therefore, designed on the premise that an employer is entitled to issue rules regarding tasks and activities, breaks, and supervision (Collins, 1986; Supiot, 2015, p. 56).
This suggests that it is necessary to investigate whether the power relations on which the protections for data subjects are designed account for the element of subordination and economic dependence that characterises employment relationships. In the following subsections, I investigate the repercussions of juxtaposing data subjects' rights on employees. I aim to uncover any instances of a "Lost in Translation" effect, where the legal protections designed for data subjects may fail to align effectively with labour law protection goals.
A. DATA SUBJECTS AND THEIR RIGHTS
A brief overview of the data subject's rights12 reveals the average recipient for whom the protections were envisioned by the EU legislator, defined by Gianclaudio Malgieri as the "average data subject" (Malgieri, 2023, pp. 39-40). Articles 12-22 of the GDPR outline various rights that empower natural persons, given sufficient legal guarantees and information, to control, choose, and withdraw the information they provide to a data controller. Specifically, through these data subject rights, natural persons can fulfil their interests:
I. "To be aware of, and verify, the lawfulness of the processing" (European Data Protection Board, 2022, p. 10). The right of access enables data subjects to gain a meaningful understanding of how their personal data are being processed, including the associated consequences, and to verify the accuracy of such processing without needing to justify their intention.13 Data subjects are expected to develop awareness of how their data is used and the consequences of such use through clear and simple explanations about the data processing provided by the data controller in advance (Article 29 Data Protection Working Party, 2018, p. 16)14;
II. To correct - "rectify" - any inaccuracies regarding themselves in the data processing.15 They can further request the deletion of their data under the conditions specified in Article 17, or request that processing be restricted in accordance with Article 18. The CJEU has underscored that the effective implementation of these rights relies on the right of access16; when data subjects can enhance their awareness through access, they are better equipped to utilise Articles 16-1817;
III. To object to the processing of their data based on circumstances related to the data subject's situation, as outlined in Article 21. The data controller is required to halt the processing unless they can demonstrate a compelling interest that justifies the continued processing of the data subject's personal data (Article 29 Data Protection Working Party, 2018, pp. 18-19);
IV. To not be subjected to automated decision-making, as per Article 22. The data subject has the option to opt out of automated decision-making processes that may have legal or significant effects on them. In cases where this right is not available under specific circumstances, the data subject still retains the right to request human intervention to express their viewpoint and contest the decision.18
This brief overview of the protections afforded by data subjects' rights illustrates the "average data subject" that the EU legislator had in mind - namely, an informed, actively interested, identified individual who wishes to determine their own representation in the personal data processed, or to stop the processing itself (Viljoen, 2021, pp. 625-626). This average actor, according to Articles 12-22 of the GDPR, would be able to make use of the information they must receive from the data controller and exercise their rights to ensure that the data controller complies with the GDPR itself and, where allowed, with the data subject's wishes.19 These general rights can, apparently, be activated by any average data subject in any context. However, in certain instances, the GDPR acknowledges conditions in which data subjects need stronger protection. For instance, Article 9 of the GDPR prohibits the processing of sensitive data such as racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, including the processing of genetic and biometric data. Member States, moreover, can introduce more protective rules for employees in the context of data processing at work.20 The Article 29 Working Party further emphasises that employers must interpret and apply the GDPR considering the power imbalance and information asymmetry experienced by employees, deeming employee consent as an invalid legal basis in most cases (Article 29 Data Protection Working Party, 2001, 2017b).
Beyond these specific stronger protections, the assumption behind data subject rights21 is that actively interested individuals can act in several ways to express their interests and wishes pertaining to data processing. Julie Cohen describes this regulatory approach as one aimed at protecting a rational person - she defines the latter as the "liberal self" - who is critically independent of relational ties (Cohen, 2013, p. 1910). Is there room for the "liberal self" in data processing at work? In introducing this article, I referred to the work of Simitis, who, as early as 1987, was already able to point to data processing at work ("personnel information systems") as promoting greater adjustment of employees to employers' opinions and directives, systematically inhibiting critical reactions - in other words, suggesting that there is little room for the "liberal self" in data processing at work (Simitis, 1987, p. 723).
Building on Malgieri's framework, I further explore how the "average data subject" was conceived and built as a "liberal self" in the ecosystem of the GDPR (Malgieri, 2023, pp. 1-16). This investigation will assist in determining whether the power relations on which data subject rights are designed account for the element of subordination and economic dependence characterising the employment relationship.
B. WHO IS THE DATA SUBJECT?
Recital 7 of the GDPR highlights the importance of a strong data protection framework to build the trust essential for the digital economy's growth (while safeguarding data subjects' fundamental rights).22 The GDPR is in fact "legally justified on the basis of internal market considerations alone" (Lynskey, 2014, p. 570) - the EU has no exclusive competence on social policy and labour.23 The GDPR finds its legal basis in Article 16(2) of the Treaty on the Functioning of the European Union (TFEU), which empowers the EU to lay down rules on the protection of personal data and on its free movement within the internal market.24 Despite the focus on data subjects' fundamental rights, the EU legislator, in any event, had to design the GDPR upon an ecosystem different from the workplace.25
Interestingly, the objectives of EU consumer law run along the same lines. According to the European Data Protection Supervisor's (EDPS) view: "EU consumer protection law aims to remove barriers to the internal market by building trust in products and services throughout the internal market, on the basis of transparency and good faith" (European Data Protection Supervisor, 2014, p. 23). The EDPS further pointed out that the "EU approaches to data protection [...] and consumer protection share common goals, including the promotion of growth, innovation and the welfare of individual consumers" (European Data Protection Supervisor, 2014, p. 3). The shared intents of data and consumer protection in the EU legislative action are further reflected in the tools designed to reach those goals.26 Clear and intelligible information, notices, and individual rights to safeguard personal interests are instruments also present in consumer law to shield the "liberal self" from abuses by market stakeholders. The EU has opted for a similar strategy in regulating the European single (data) market. Information and power asymmetries also exist in consumer relations, and this imbalance may have various kinds of repercussions depending upon what the consumer is searching for and needs (Malgieri, 2023, pp. 27-28; Svantesson, 2018).
Highlighting this interlink is of utmost importance. As noted by Malgieri, the notion of "average consumer" in EU consumer law has had a major role in shaping the notion of "average data subject" in the GDPR.
The Unfair Commercial Practices Directive (UCPD) 2005/29/EC states in Recital 18 that "This Directive takes as a benchmark the average consumer, who is reasonably well-informed and reasonably observant and circumspect, taking into account social, cultural and linguistic factors, as interpreted by the Court of Justice."27 The European average consumer is thus characterised as a reasonably well-informed, observant, and circumspect individual. This definition resurfaces in other provisions: for instance, under Article 5(2)(b) UCPD, the prohibited unfair commercial practices are those that "materially distort [...] the economic behaviour with regard to the product of the average consumer whom it reaches or to whom it is addressed." Therefore, safeguarding the freedom of choice of the "average consumer" is paramount to prevent transactional decisions that would not have otherwise occurred. Regulation (EU) No 1169/2011 on the provision of food information to consumers adopts the same approach in identifying who the consumer in the European single market is.28 Despite the protections provided, the consumer is not classified as a "weak party" when it comes to shaping and pursuing their own interests. The UCPD operates on the underlying assumption that once the mandatory information is provided to consumers, they will exercise due diligence in selecting and evaluating commercial offers, making informed choices (Trzaskowski, 2011).
Ultimately, the average consumer exists in EU law as a reasonably well-informed, observant, and circumspect individual. These three characteristics are specifically mirrored in the structure of the data subject rights.29 At the same time, the GDPR, in specific contexts such as sensitive data processing30 or through opening clauses for more tailored Member States' legislation in the context of employment,31 provides for stronger protections for particular data subjects. This approach is reflected in the Art. 29 WP's view of consent as an invalid legal ground in data processing at work, which recognises that consent can create an illusion of equal bargaining power, ignoring the inherent power and economic imbalances present in employment contexts (Article 29 Data Protection Working Party, 2017a). However, outside of these specific instances, the portrayal of the average data subject in the data subjects' rights32 remains that of an informed, rational, and proactive actor, actively exercising their rights. Those rights are based on the expectation that the data subject exercises them autonomously, i.e., on the expectation that "all data subjects are rational actors that will read all privacy statements and carefully weigh and balance the consequences" (Schermer et al., 2014, p. 179). This underlying principle is further affirmed by Article 80 of the GDPR ("Representation of data subjects"), which does not allow the representative of the data subjects (such as a trade union) to exercise data subject rights on their behalf33 - unless explicitly provided for by EU Member States' domestic laws (European Data Protection Board, 2022, p. 29). Instead, Article 80 limits representatives to exercising the rights provided in Articles 77-79 and 82, such as lodging complaints with the Data Protection Authority (DPA).34 The right to lodge a complaint with a DPA, logically, has no ties with the exercise of data subject rights, e.g., those in Articles 12-22 of the GDPR. Representatives may only intervene under Articles 77-79 and 82 once data subjects have individually exercised their rights. This dynamic preserves the fundamental premises of the GDPR and of the "average consumer": data subjects are capable individuals who engage with information policies and notices from data controllers and have the capacity and freedom to make informed choices and take action, whether independently or through representatives. In the EU legislative action, as Malgieri argues, the notion of "average consumer" has shaped the conception of the "average data subject" in terms of "mental capabilities, awareness, understanding [and designing] of rights and risks" (Malgieri, 2023, p. 27). This consumer-centric perspective reflects an underlying power relation where the primary need is to ensure transparency and informed consent, assuming that individuals can protect their interests once adequately informed. However, this portrayal overlooks the fact that not all individuals fit this idealised profile. While provisions such as Articles 9 and 88 of the GDPR explicitly recognise the existence of data subjects with different (and stronger) protection needs, this dimension is notably absent in the design of data subject rights.
Data subjects' rights thus outline subjects akin to rational consumers in the ecosystem of the European single market. They are outlined on similar assumptions pertaining to their autonomy and agency: i.e., once reasonably informed by the data controller, they shall act rationally to fulfil their own interests. The overlapping structure of the GDPR and EU consumer protection can then be represented as in Table 1.
Once established that the concept of "average consumer" has influenced the notion of "average data subject" under the GDPR, it is pertinent to explore if and how the latter can be juxtaposed - translated - to employees. In other words, I further discuss whether a legal protection (the data protection one) designed upon the dynamics of a consumer market ultimately undermines the enjoyment of those rights by employees.
C. LOST IN TRANSLATION: CONSUMER PROTECTION DISGUISED AS LABOUR LAW PROTECTION
The average subject according to data subjects' rights thus embodies a vigilant individual empowered to assert their interests regarding personal data. This portrayal assumes that a website visitor or a customer of a service, armed with adequate and transparent information, can elude potential external and deceptive pressures from advertising, enabling them to make the most informed choice for themselves (Trzaskowski, 2022). Article 4(11) and Article 7 of the GDPR, in defining consent, clearly state that what is to be protected is the "unambiguous indication of the data subject's wishes." This further illustrates the prevailing goal of empowering the "liberal self" to act towards the realisation of their interests.
In an employment relationship, however, data subjects stand in a relationship of subordination to their employers (Moore, 2022, p. 258). The interests of data subjects in a working environment will be complemented by other factors that cannot be completely juxtaposed to those of a consumer: facing sanctions, termination, or resigning from employment is not akin to opting for a different seller or being misled by deceptive e-commerce information.
Although subordinate employment is based on a contractual relationship, the autonomy and agency of the employee-data subject is shaped by the employer-data controller (Moore, 2022, p. 265). The employer possesses the authority to set working conditions and unilaterally implement directives, monitoring, or disciplinary actions. Consequently, labour law recognises the limited bargaining power of employees and aims to prevent them from accepting working conditions that fall below established labour standards. Employees' agency, as pointed out by Maayan Niezna and Guy Davidov, is a matter of degree and contexts - a continuum - that might extend between "free enthusiastic choice and duress" (Niezna & Davidov, 2023, p. 1141).
Private autonomy, contractual freedoms, and values protected in labour law differ from those outlined in consumer law (Davidov, 2016, pp. 34-54). The continuum between choice and duress in employment relationships carries far-reaching implications for fundamental protections, including human dignity (De Schutter, 2013; Fenwick & Novitz, 2010), essential social inclusion (Collins, 2010, p. 22; Freedland & Kountouris, 2011, pp. 374-375), human and decent working conditions (Collins, 2022), and economic livelihood (Davidov, 2016, pp. 45-46). Unlike consumer information asymmetry, which does not systematically affect these core protections, European labour standards and domestic laws rigorously restrict the bargaining ground to reduce the span of that continuum, imposing limits, e.g., on the number of working hours that can be provided to employers (Cabrita & Böhmer, 2016), on health and safety standards in working environments (Ponce Del Castillo, 2016), on determining minimum wages and social security obligations (Ratti, 2023).
Therefore, while a consumer-data subject can be regarded as a reasonably well-informed, observant, and circumspect average person, this assumption does not hold in employment contexts. Labour law aims to re-establish power symmetry through non-waivable labour rights, counteracting market and contractual dynamics that could compromise human dignity, social inclusion, emancipation, and economic livelihood (Bogoeski, 2023; Davidov, 2020). Employees might in fact accept conditions due to economic and social coercion inherent in subordination, compounded by personal vulnerabilities such as poverty, limited choices, or debt (Mantouvalou, 2014; Niezna, 2024).
Ultimately, data subjects-employees exist in an ecosystem of contractual duties and lower bargaining power against employers, which fundamentally redefines their "liberal self" compared to that of the "average consumer." Table 1 on the intersection of consumer and data processing relationships changes when adapted to this new scenario (Table 2).
The subordination of the data subject to the data controller, combined with the social and economic significance of employment, renders the employee a structurally "vulnerable" data subject, as defined by the Art. 29 WP (Article 29 Data Protection Working Party, 2017a, p. 11). This leads me to the broader conclusion that the agency of individuals in a relationship of subordination (employee-employer) cannot always be equated with that of individuals in a relationship of identification or consumption (data subject-data controller/consumer-vendor). In the latter case, individuals are assumed to be proactive, reasonably well-informed, observant, and circumspect. However, such assumptions cannot universally apply to individuals in subordinate positions, where structural inequalities may limit their ability to act with the same level of autonomy and awareness.
Contrary to the GDPR's assumption, employees may be reluctant to exercise their data protection rights due to fear of potential job-related repercussions. While employees who choose not to exercise these rights remain data subjects, and despite the obligation of data controllers to ensure the effectiveness of these rights and adequately inform employees, the activation of these rights is inherently tied to the agency and autonomy of a subordinate individual.36 This ultimately weakens the GDPR's broader protection objectives. It underscores a critical disparity in the demand for protection within employment relationships, where inherent power imbalances can hinder the practical realisation and enforcement of data protection rights. Focusing on informing and empowering the "liberal self" - or, in other words, enhancing the autonomy and decision-making agency of the data subject - overlooks the social and economic issues addressed by labour regulation. Hence, treating employees as data subjects might create a "Lost in Translation" effect in the digital workplace. Labour law regulates employees as subjects in need of protection even from their own decision-making. In contrast to the consumer market, where the law refrains from interfering in undesirable prejudices or biases - adopting the view that individuals in a free society have the right to make their own decisions - the same cannot be said for employees. The labour law framework protects employees through non-waivable rules, standards, and collective agreements that limit the scope of their choices in ways that ensure basic rights and dignity. This distinction between the consumer and the labour market should lead to the adoption of different sets of protections for employees' and consumers' personal data (Albin, 2025, pp. 92-93).
III. EMPLOYERS AS DATA CONTROLLERS
The other main actor in data processing at work is the employer-data controller. The data controller, pursuant to Article 4(7), is any "natural or legal person, public authority, agency or other body which, alone or jointly with others, determines the purposes and means of the processing of personal data." The European Data Protection Board (EDPB) clarifies that controllership is a functional concept identifying the entity that factually decides how and why data is processed (European Data Protection Board, 2020b, pp. 10-11). In its guidelines, it further stresses that being a data controller is in some instances inherently linked to the roles or activities of a given entity. The EDPB illustrates this by pointing out that established practices in labour law naturally identify employers - due to their features - as data controllers (European Data Protection Board, 2020b, p. 12). Their role usually implies the authority to determine tasks, monitor performance, and discipline employees for non-compliance within legal limits (Prassl, 2015, p. 27; Risak & Dullinger, 2018, p. 11). The employment contract in fact affords monitoring powers, yet does not specify the intensity, quality, or pace of the monitoring activities the employer will carry out. Thus, the employer has the power to collect data, whether through direct observation or technology-based methods, such as cameras, sensors, geolocation systems, and device tracking (e.g., laptops, smartphones) (Ponce Del Castillo & Molè, 2024, p. 158).
Under Article 4(7), the employer determines the necessity of data collection for company operations, ranging from basic needs like employees' bank account numbers for salary payments to more intrusive data processing for legitimate purposes such as performance monitoring, appraisals, and health and safety management (Article 29 Data Protection Working Party, 2014, p. 17). Consequently, a widely accepted axiom in labour law literature is that any employer implementing a workforce management or monitoring system, determining the purposes and means of data processing, is accountable as the data controller (Abraha, 2023; Aloisi & Gramano, 2019).
The employer as data controller shall, under Article 24 ("Responsibility of the controller"), demonstrate compliance with the GDPR, in particular by adopting appropriate technical and organisational measures to adhere to its principles and obligations. Its role, therefore, is not only about determining purposes and means. Article 25 ("Data protection by design and by default") mandates "data protection by design," requiring employers to implement suitable technical and organisational measures aligned with the principles outlined in Article 5.37 Article 25(2) emphasises "data protection by default," meaning that any software solution adopted or developed by the controller should operate on preset values consistent with GDPR principles. The employer is required to concretely design and then implement systems that by design and by default operate in accordance with the principles and operational guidelines offered by the GDPR.38
Ultimately, these Articles require data controllers to design their processing by assessing the "risks of varying likelihood and severity for the rights and freedoms of natural persons posed by the processing."39 To track those risks, under Article 35 of the GDPR, they shall perform a Data Protection Impact Assessment (DPIA). This (preliminary) assessment under Article 35(3)(a) is mandatory in employment contexts since the processing of employees' data affects their legal status (e.g., performance evaluation, compliance with employer directives).40 The DPIA, according to Article 35(7), shall thoroughly examine the strict necessity of the processing operations to the business, their proportionality in relation to employees' rights, and the measures necessary to mitigate and prevent possible risks.
In essence, the employer as data controller under Articles 4(7), 24, 25, and 35 is assumed to be an actor with sufficient expertise to design, implement, and understand the processing of employees' data and its risks according to the principles and obligations imposed by the GDPR.
A. EMPLOYERS AS CONSUMERS OF DATA PROCESSING TOOLS
Some corporations might possess the economic and technical capabilities in-house to develop personal data processing systems from the ground up. A pertinent example is Amazon, which utilises these resources to establish distinctive organisational models reliant on the continuous collection of employees' personal data, encompassing roles from warehouse employees to delivery drivers (Delfanti, 2021). This is not unique to Amazon; many platform-based businesses directly design and operate data-driven systems that rely on the continuous collection of personal data. These platforms gather data from workers - such as delivery times, routes, ratings, and behavioural metrics - to optimise services and exert control over labour processes (Lee, 2024).
In this article, however, I focus on an aspect often overlooked in discussions about workplace data processing: employers purchasing employee management and surveillance software from specialised firms. Cogito analyses call centre employees' voices to detect patterns, providing real-time feedback if their tone lacks friendliness with customers.41 Driveri, a camera-based system, monitors truck drivers and issues audible alerts if the driver looks away from the road or reaches for something in the cup holder.42 Perceptyx, an AI software, predicts the likelihood of workforce unionisation through a union vulnerability index.43 These are a few examples of companies that design, sell, and manage data processing systems, thereby facilitating the widespread adoption of such technologies as a product to be sold to employers. In 2021, the Little Tech database tracked this market for software and hardware that includes small to medium-sized commercial companies, business intelligence firms, military companies, start-ups, data brokers, and app developers (Negrón, 2021). Employers purchase systems from them, thus embedding a provider within the corporate structure, responsible for developing, maintaining, and possessing operational knowledge of the systems operating the data processing (Molè, 2022, pp. 91-92; Molè, 2024).
An information asymmetry emerges between the provider, who designs and implements the system, and employers, who simply utilise it within their organisation. Given the GDPR's focus on accountability for the data controller, how will employers, who are customers of these companies rather than "creators" of data processing systems, ensure compliance with their obligations? In this new scenario, it is pertinent to introduce another actor in the ecosystem of the GDPR: the processor.
B. DATA CONTROLLERS AND PROCESSORS
The GDPR provides that employers, as data controllers, may engage processors, as defined in Article 4(8), to process personal data on their behalf. Processors - encompassing various entities such as natural or legal persons, public authorities, agencies, or other bodies - are distinct entities from controllers: they are those companies developing, selling, and managing the data processing tools. Controllers can entrust to them some or all of the processing activities (European Data Protection Board, 2020b, pp. 25-26). The EDPB defines the processor as fully serving the data controller's interests. It shall only follow instructions from the controller regarding the processing purpose and the fundamental aspects of the processing methods (European Data Protection Board, 2020b, p. 26). It is thus an instrumental actor in fulfilling the data controller's determination of purposes, means, and compliance with the GDPR. According to Article 28(10), if the processor exceeds the instructions received, it assumes the role of a data controller, since it is independently determining the processing activities.44
Under Article 28(1), the data controller has a duty to carefully select processors that provide "sufficient guarantees to implement appropriate technical and organizational measures" ensuring compliance with the GDPR and safeguarding the rights of the data subjects.45 The EDPB in its guidelines highlights as well that the employer-data controller must (i) evaluate whether the processor allows them adequate control over the processing, and (ii) assess potential risks (European Data Protection Board, 2020b, pp. 27-28).46 The relationship outlined by the GDPR between the processor and the data controller creates a unique dynamic: despite a vast market of processors-providers designing and contracting data processing services to data controllers - such as those for work management and surveillance - the latter, as customers of these companies, remain responsible for selecting only those processors that are compliant with the GDPR.
This raises the question of whether the GDPR properly addresses this reversed dynamic, where an employer purchasing a product - and not its actual developer - must ensure that its practical functioning complies by design and by default with the data protection framework. Does the processor have specific duties in this regard - of information, collaboration, or coordinated compliance - towards the employer? A closer look at Article 28 reveals that the processor has loose obligations towards the data controller. Article 28(5) states that processors may adhere to codes of conduct47 or certification mechanisms48 to prove their compliance with the GDPR, thereby relieving data controllers of the duty to verify it; yet the GDPR outlines these as possibilities and not obligations on the processor (Koulierakis, 2023). Article 28(3) requires the controller and processor to enter into an agreement setting out the subject-matter and duration of the processing, the nature and purpose of the processing, the type of personal data, categories of data subjects, and the obligations and rights of the controller. However, the content of this agreement is only broadly specified in Article 28(3) and does not mandate clear information and collaboration duties for the processor towards the data controller.
The processor has only a general obligation to provide the data controller with the necessary information whenever data subjects exercise their rights according to Articles 12-22,49 together with the duty to provide the controller with all the information necessary to perform the DPIA (European Data Protection Board, 2020b, p. 40).50 Although a processor may be in a better position to conduct the DPIA, the controller is ultimately responsible for ensuring its completion (Kalsi, 2024, pp. 351-352). In fact, the processor is required to provide the information to the controller drafting a DPIA only "where necessary and upon request" of the latter.51 Consequently, it is the controller's - not the processor's - responsibility to initiate, oversee, and retrieve all of the relevant information for the successful and most accurate completion of a DPIA (European Data Protection Board, 2020b, p. 39). The EDPB further stresses that the processor retains only a general duty of assistance towards the data controller, excluding any transfer of responsibility (European Data Protection Board, 2020b, p. 39). The accountability outlined by the GDPR remains firmly entrusted to the data controller-employer, both at the design and the implementation stage of the data processing.
C. LOST IN TRANSLATION: EMPLOYERS DISGUISED AS "CREATORS" OF DATA PROCESSING
The employer must fulfil obligations under Articles 24, 25, and 35 by acting as an entity equipped with the expertise and technical tools to design and build a personal data processing system. Even when employers purchase and use software and hardware from highly specialised digital technology companies, they retain the status of data controllers and are responsible for ensuring compliance with the GDPR. They must be data controllers - according to the EDPB and a broad consensus in the literature and case law - since they determine the organisation of the company and thus which data processing is necessary for which purposes (European Data Protection Board, 2020b, p. 11; Abraha, 2023); this also makes it difficult to consider processors potential joint controllers in employment contexts.52
As seen, hierarchical primacy is the main feature of employment contracts. This primacy, according to the EDPB, would somehow mirror that of the data controller, since under Article 28(1) employers must select only GDPR-compliant processors. Although this may be a feasible eventuality, the scenario where the employer goes to the market to buy an innovative product establishes a different dynamic - namely, that the employer is a customer of a service, and as such should be given sufficient information and protection so that the producer does not conceal risks to employees' rights, as Cogito and the above examples seem to show (Wood, 2021, p. 6). This asymmetry of information and power is acknowledged in the recently approved EU AI Regulation, which mandates that providers must adhere to certain requirements and inform deployers (employers) on how to control and mitigate risks associated with AI systems before bringing them to market. The employer's role shifts from being the data processing "creator" outlined by Articles 4(7), 24, and 25 of the GDPR to being a vulnerable subject who receives protections against complex AI products sold to monitor and manage the workforce (Molè, 2024, pp. 182-184).53
The GDPR, instead, by targeting only data controllers through Articles 24, 25, and 35, assumes that data controllers can influence all other parties involved in data processing - including processors. This assumption of agency and expertise undermines robust data protection through technology design, as envisioned by the privacy by design and by default principles (Kalsi, 2024, pp. 347-348). The GDPR, while acknowledging that processors sell and provide services, does not assign them - as the EU AI Regulation does - specific compliance obligations similar to the ones today fictitiously imposed on the employer.54 Processors are in fact not required to disclose compliance details beyond facilitating data subject rights and cooperating with DPIAs, leaving employers without sufficient tools to fulfil their accountability duties. The complexity of the tools operated by processors does not absolve employers of their obligations under the GDPR. As data controllers, employers remain fully accountable for ensuring compliance with Articles 24, 25, and 35. This is precisely the problem: employers are tasked with the central responsibility of aligning data processing activities with the GDPR, yet processors are not required to disclose detailed information about the technicalities or risks associated with the IT products they design and provide.
Paradoxically, the EDPB guidelines stress that if an employer's instruction breaches data protection laws, the processor shall have the right to terminate the contract (European Data Protection Board, 2020b, p. 39). An employer, as it stands, might have no insight into the functioning of the data processing other than through user dashboards. This further illustrates how the GDPR overlooks the impact of the market of new technologies in shaping data processing at work and, concurrently, overlooks the (missing) autonomy of employers in understanding and setting up employees' data processing (Molè, 2024).
The GDPR's framework inadequately addresses the dynamics between data controllers and processors. This gap stems from the assumption that data controllers can independently and effectively select GDPR-compliant processors, which may not always be the case. Moreover, as seen previously in Section II, when employers are data controllers effectively designing and operating the data processing systems, they often possess greater control over data processing activities than typical data controllers, as they can exercise significant influence over data subjects compared to other kinds of data controllers. Consequently, equating the agency and autonomy of employers with that of data controllers under the GDPR oversimplifies the complexities inherent in employer-provider relationships and the unique challenges posed by employer-controlled data processing systems.
Applying the data controller status to the employer might generate a "Lost in Translation" effect. When employers effectively act as data controllers - designing and operating their own data processing systems - they exercise considerable influence and control over employees. This substantial power may warrant imposing more stringent legal obligations on them to safeguard employees' rights. However, as seen in this Section III, when employers rely on external providers for data processing, the assumption that they inherently understand and can manage the risks associated with complex data processing may be misplaced. Employers often purchase software products they neither design nor manage, limiting their comprehension of these systems. This reliance can lead to an overestimation of employers' autonomy and agency, as their capacity to identify and mitigate risks associated with externally sourced tools for workplace management and surveillance may be constrained. Consequently, the GDPR's framework may inadequately address these nuances, potentially leaving employers accountable without sufficient means or rights to fulfil their obligations effectively.
IV. IN SEARCH OF A PROPER TRANSLATION: ADDRESSING AGENCY GAPS
The proposed analysis of "Employees as Data Subjects" and "Employers as Data Controllers" in the GDPR has been inspired by a series of intuitions within the rich scholarship intersecting labour law and data protection law. In 2016, Marta Otto addressed a major misconception, arguing that privacy at work should not be regarded as merely a complementary right but as a proper social right bound up with dignity and just conditions at work (Otto, 2016, p. 195). In 2019, Emanuele Dagnino and Ilaria Armaroli discussed the essentially reactive and defensive role of collective actors in the introduction of new workplace technologies (Dagnino & Armaroli, 2019). By 2023, Abraha highlighted the "individualistic nature of data protection law," noting that "the GDPR's exclusive focus on individual data subjects and individual rights does not easily fit with workers' rights and interests" (Abraha, 2023, p. 184). Finally, Dan Calacci and Jake Stein, also in 2023, emphasised that "Regulating data collection and use in the workplace is now more a matter of regulating working conditions than data protection" (Calacci & Stein, 2023, p. 253).
What emerges as common to all these insights is a misalignment between the function of personal data regulations, specifically the GDPR, and that of those pertaining to employment relationships. The GDPR assumes a certain baseline of agency for both data subjects and data controllers. Data subjects' rights presume that the data subject is a reasonably well-informed, observant, and circumspect individual,55 and that the data controller is either the "creator" of the data processing activities or, at the very least, has the capacity and tools to steer them towards compliance with the GDPR.56 However, these assumptions break down in the employment context due to the inherently hierarchical nature of the employer-employee relationship and the insufficiently regulated information asymmetry between the employer (data controller) and the providers (processors). As a result, applying the GDPR's general framework in those instances fails to address the unique power dynamics and vulnerabilities inherent in these relationships. This highlights the urgent need for labour-specific provisions that explicitly account for these disparities.
The search for a "proper translation" should start from premises identified by the CoE Recommendation No. R(89)2 and the Art. 29 WP opinions advocating for a regulatory approach proper to labour law (Article 29 Data Protection Working Party, 2001, 2017b). In fact, other assumptions about employers' and employees' agency are present in labour law. The values protected are different. While EU data protection, influenced by consumer law, emphasises the protection of individuals' private spheres against external interferences, labour law focuses on regulating hierarchical relationships to uphold social and economic values associated with work as a means of livelihood, personal fulfilment, and societal inclusion (Davidov, 2016, p. 54). By acknowledging these differences, it will be possible to properly regulate by accounting for the hierarchical dynamics of labour and its system of values. It is a warning that also comes from the European Court of Human Rights, which, in issues of workplace surveillance, has emphasised that "Due to the technological ease with which electronic surveillance can be carried out and disseminated [...] a clear and foreseeable legal framework, with appropriate and effective safeguards, becomes of paramount importance."57
In this regard, the labour law scholarship currently investigates whether these "appropriate and effective" safeguards are present in the recently approved EU AI Regulation,58 which targets providers of AI systems, and in the PWD,59 which regulates the use of algorithmic management by labour platforms (Cristofolini, 2024; Potocka-Sionek & Aloisi, 2025). These frameworks operate on the premise that AI systems for workforce management and monitoring, as well as labour platforms, must meet specific requirements regarding their development, implementation, and operation. Due to space constraints and a focus on the GDPR, a detailed analysis of their design and provisions is beyond this discussion. However, it is noteworthy that, compared to the GDPR, both the EU AI Regulation and the PWD impose prohibitions on certain data processing activities deemed inherently harmful.60 They also grant particular rights to platform workers61 and establish specific transparency and accountability obligations for AI system providers.62 This regulatory approach signifies a departure from the GDPR's framework by acknowledging the unique safeguards necessary for both workers and employers engaging with external providers, alongside imposing stringent obligations on employers who design and operate their own AI systems or labour platforms. Whether these legislations will be adequate to meet the challenges identified so far, or whether a more comprehensive framework for data processing at work is necessary, is a matter of debate and will require further investigation (Adams-Prassl et al., 2023; Albin, 2025).
V. CONCLUSION
This article explored the increasingly topical intersection between labour law and data protection - particularly the GDPR - by means of a legal analysis of its two main protagonists: the data subject and the data controller. Labour law scholarship has so far not thoroughly addressed who data controllers and data subjects are. Do their characteristics, capabilities, rights, obligations, and interests as outlined in the GDPR correspond to those we would find placed upon employers and employees in an employment relationship?
Through an exploration of the data subject's rights63 and the data controller's obligations,64 I discussed how juxtaposing the GDPR's categories and regulatory approaches to employment contexts might create a "Lost in Translation" effect. The data subject status, as delineated by these rights, extends a consumer-oriented form of protection to employees. As I illustrated in Section II, the EU legislator built its notion of data subject and the related rights on the assumption - present in EU consumer law - that those subjects can be reasonably well-informed, observant, and circumspect. Consequently, they are expected to use the information received65 to exercise their rights and guide the data controller towards compliance with the GDPR in line with their wishes.66
Contrary to this consumer-oriented approach, employees may be reluctant to exercise their rights due to fear of job-related repercussions. Unlike consumer protection, which relies on mandatory information to empower consumers to exercise due diligence when selecting and evaluating offers, labour law protection rests on a fundamentally different rationale. Employees in fact depend on their remuneration, and their agency is impacted by the hierarchical authority of the data controller. They might not be willing to express their "unambiguous wishes"67 in order to adjust to those of their employers, as pointed out by Simitis (1987, p. 723).
Employees who choose not to exercise their rights as data subjects nonetheless remain data subjects and enjoy some forms of protection. They benefit from stronger protections on the processing of sensitive data,68 from specific national legislation (where applicable),69 and from the Art. 29 WP guidelines, which interpret consent as an invalid legal basis for processing in employment contexts (Article 29 Data Protection Working Party, 2017b). However, the overall protective aim of the GDPR is hampered in this context: key elements of the GDPR's framework, such as the rights of data subjects, might prove ineffective when applied to the unique dynamics of the employment relationship.
The employer, as data controller, is accountable for shaping systems of workforce management and surveillance according to the principles and obligations dictated by the GDPR.70 In some instances, however, employers may use data processing systems designed and operated by external providers. By analysing processors under Article 28 of the GDPR, I illustrated how these companies are not held sufficiently accountable for enabling employers to fulfil their accountability duties as data controllers. In these cases, the employer as data controller may act as a fictitiously empowered entity, which potentially undermines the protection of employee rights due to a lack of sufficient understanding of, and control over, the data processing.
This led me to identify several factors suggesting that the employer-employee relationship cannot always be directly equated with the data controller-data subject relationship as outlined by the GDPR.
By stressing the substantial differences present in those relationships, this article provides the current debate with a conceptual framework that critically analyses the categories of the GDPR in relation to those of labour law. The analysis shows that it is necessary to move beyond the GDPR - and, more broadly, "general" data protection frameworks - in regulating data processing at work, and to develop tailored regulatory approaches that address the unique aspects of employment relationships.
VI. REFERENCES
Abraha, H. H. (2022). A pragmatic compromise? The role of Article 88 GDPR in upholding privacy in the workplace. International Data Privacy Law, 12(4), 276-296. https://doi.org/10.1093/idpl/ipac015
Abraha, H. (2023). Regulating algorithmic employment decisions through data protection law. European Labour Law Journal, 14(2), 117-332. https://doi.org/10.1177/20319525231167317
Adams-Prassl, J., Abraha, H., Kelly-Lyth, A., Silberman, M. S., & Rakshita, S. (2023). Regulating algorithmic management: A blueprint. European Labour Law Journal, 14(2), 124-151. https://doi.org/10.1177/20319525231167299
Albin, E. (2025). The three-tier structural legal deficit undermining the protection of employees' personal data in the workplace. Oxford Journal of Legal Studies, 45(1), 81-107. https://doi.org/10.1093/ojls/gqae033
Aloisi, A., & De Stefano, V. (2024, March 16). "Gig" workers in Europe: The new platform of rights. Social Europe. https://www.socialeurope.eu/gig-workers-in-europe-the-new-platform-of-rights
Aloisi, A., & Gramano, E. (2019). Artificial intelligence is watching you at work: Digital surveillance, employee monitoring, and regulatory issues in the EU context. Comparative Labor Law & Policy Journal, 41(1), 95-121. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3399548
Article 29 Data Protection Working Party. (2001). Opinion 8/2001 on the processing of personal data in the employment context. https://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/files/2001/wp48_en.pdf
Article 29 Data Protection Working Party. (2010). Opinion 1/2010 on the concepts of "controller" and "processor." https://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/files/2010/wp169_en.pdf
Article 29 Data Protection Working Party. (2014). Opinion 06/2014 on the notion of legitimate interests of the data controller under Article 7 of Directive 95/46/EC. https://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/files/2014/wp217_en.pdf
Article 29 Data Protection Working Party. (2017a). Guidelines on Data Protection Impact Assessment (DPIA). https://ec.europa.eu/newsroom/article29/items/611236/en
Article 29 Data Protection Working Party. (2017b). Opinion 2/2017 on data processing at work. https://ec.europa.eu/newsroom/article29/items/610169
Article 29 Data Protection Working Party. (2018). Guidelines on automated individual decision-making and profiling for the purposes of Regulation 2016/679. https://ec.europa.eu/newsroom/article29/items/612053
Bogoeski, V. (2023). Nonwaivability of labour rights, individual waivers and the emancipatory function of labour law. Industrial Law Journal, 52(1), 179-213. https://doi.org/10.1093/indlaw/dwac020
Cabrita, J., & Böhmer, S. (with Galli da Bino, C., & European Foundation for the Improvement of Living and Working Conditions). (2016). Working time developments in the 21st century: Work duration and its regulation in the EU. Publications Office of the European Union. https://doi.org/10.2806/888566
Calacci, D., & Stein, J. (2023). From access to understanding: Collective data governance for workers. European Labour Law Journal, 14(2), 253-282. https://doi.org/10.1177/20319525231167981
Cogito: Real-Time AI Coaching & Guidance for Contact Centers. (n.d.). [Computer software]. https://cogitocorp.com
Cohen, J. E. (2013). What privacy is for. Harvard Law Review, 126(7), 1904-1933. https://harvardlawreview.org/print/vol-126/what-privacy-is-for/
Collins, H. (1986). Market power, bureaucratic power, and the contract of employment. Industrial Law Journal, 15(1), 1-14. https://doi.org/10.1093/ilj/15.1.1
Collins, H. (2010). Employment law (2nd ed.). Oxford University Press.
Collins, P. M. (2022). Putting human rights to work: Labour law, the ECHR, and the employment relation (1st ed.). Oxford University Press. https://doi.org/10.1093/oso/9780192894595.002.0005
Council of Europe. (1989). Recommendation No. R(89)2 of the Committee of Ministers to member states on the protection of personal data used for employment purposes. https://rm.coe.int/1680500b15
Cristofolini, C. (2024). Navigating the impact of AI systems in the workplace: Strengths and loopholes of the EU AI Act from a labour perspective. Italian Labour Law E-Journal, 17(1), 75-103. https://doi.org/10.6092/issn.1561-8048/19796
Dagnino, E., & Armaroli, I. (2019). A seat at the table: Negotiating data processing in the workplace. A national case study and comparative insights. Comparative Labor Law & Policy Journal, 41(1), 173-195.
Davidov, G. (2016). A purposive approach to labour law. Oxford University Press. https://doi.org/10.1093/acprof:oso/9780198759034.001.0001
Davidov, G. (2020). Non-waivability in labour law. Oxford Journal of Legal Studies, 40(3), 482-507. https://doi.org/10.1093/ojls/gqaa016
De Schutter, O. (2013). Human rights in employment relationships: Contracts as powers. In F. Dorssemont, K. Lörcher, & I. Schömann (Eds.), European convention on human rights and the employment relation (pp. 105-140). Bloomsbury Publishing. https://doi.org/10.5040/9781474200301
Delfanti, A. (2021). Machinic dispossession and augmented despotism: Digital work in an Amazon warehouse. New Media & Society, 23(1), 39-55. https://doi.org/10.1177/1461444819891613
European Data Protection Board. (2020a). Guidelines 4/2019 on article 25 data protection by design and by default. https://www.edpb.europa.eu/sites/default/files/files/file1/edpb_guidelines_201904_dataprotection_by_design_and_by_default_v2.0_en.pdf
European Data Protection Board. (2020b). Guidelines 07/2020 on the concepts of controller and processor in the GDPR. https://www.edpb.europa.eu/system/files/2023-10/EDPB_guidelines_202007_controllerprocessor_final_en.pdf
European Data Protection Board. (2022). Guidelines 01/2022 on data subject rights - Right of access. https://www.edpb.europa.eu/system/files/2023-04/edpb_guidelines_202201_data_subject_rights_access_v2_en.pdf
European Data Protection Supervisor. (2014). Privacy and competitiveness in the age of big data: The interplay between data protection, competition law and consumer protection in the digital economy. https://edps.europa.eu/sites/edp/files/publication/14-03-26_competitition_law_big_data_en.pdf
Fenwick, C., & Novitz, T. (2010). Human rights at work: Perspectives on law and regulation. Bloomsbury Publishing. http://dx.doi.org/10.1353/hrq.2011.0051
Finkin, M. W. (1995). Privacy in employment law (1st ed.). Bureau of National Affairs.
Freedland, M. R., & Kountouris, N. (2011). The legal construction of personal work relations. Oxford University Press. https://doi.org/10.1093/acprof:oso/9780199551750.001.0001
Hendrickx, F. (2022). Protection of workers' personal data: General principles (ILO Working Paper No. 62). International Labour Organization. https://webapps.ilo.org/static/english/intserv/working-papers/wp062/index.html
Kalsi, M. (2024). Still losing the race with technology? Understanding the scope of data controllers' responsibility to implement data protection by design and by default. International Review of Law, Computers & Technology, 38(3), 346-368. https://doi.org/10.1080/13600869.2024.2324546
Kerber, W. (2016). Digital markets, data, and privacy: Competition law, consumer law and data protection. Journal of Intellectual Property Law & Practice, 11(11), 856-866. https://doi.org/10.1093/jiplp/jpw150
Koulierakis, E. (2023). Certification as guidance for data protection by design. International Review of Law, Computers & Technology, 38(2), 245-263. https://doi.org/10.1080/13600869.2023.2269498
Lee, S. (2024). The "AI mode": How food delivery riders in the Netherlands and South Korea experience algorithmic management. Italian Labour Law E-Journal, 17(2), 67-88. https://doi.org/10.6092/issn.1561-8048/20869
Lynskey, O. (2014). Deconstructing data protection: The "added-value" of a right to data protection in the EU legal order. International and Comparative Law Quarterly, 63(3), 569-597. https://doi.org/10.1017/S0020589314000244
Malgieri, G. (2023). Vulnerability and data protection law. Oxford University Press. https://doi.org/10.1093/oso/9780192870339.001.0001
Mantouvalou, V. (2014). The right to non-exploitative work. In V. Mantouvalou (Ed.), The right to work - legal and philosophical perspectives (pp. 39-60). Bloomsbury Publishing.
Molè, M. (2022). The internet of things and artificial intelligence as workplace supervisors: Explaining and understanding the new surveillance to employees beyond Art. 8 ECHR. Italian Labour Law E-Journal, 15(2), 87-103. https://doi.org/10.6092/issn.1561-8048/15598
Molè, M. (2024). Commodified, outsourced authority: A research agenda for algorithmic management at work. Italian Labour Law E-Journal, 17(2), 169-188. https://doi.org/10.6092/issn.1561-8048/20836
Molè, M., & Mangan, D. (2023). "Just more surveillance": The ECtHR and workplace monitoring. European Labour Law Journal, 14(4), 694-700. https://doi.org/10.1177/20319525231201274
Moore, P. V. (2022). Problems in protections for working data subjects: The social relations of data production. Global Political Economy, 1(2), 257-270. https://doi.org/10.1332/UKGE7444
Negrón, W. (2021). Little tech is coming for workers: A framework for reclaiming and building worker power. CoWorker.org. https://home.coworker.org/wp-content/uploads/2021/11/Little-Tech-Is-Coming-for-Workers.pdf
Netradyne. (n.d.). Driveri. https://www.netradyne.com/
Niezna, M. (2024). Consent to labour exploitation. Industrial Law Journal, 53(1), 3-33. https://doi.org/10.1093/indlaw/dwad036
Niezna, M., & Davidov, G. (2023). Consent in contracts of employment. The Modern Law Review, 86(5), 1134-1165. https://doi.org/10.1111/1468-2230.12802
Otto, M. (2016). The right to privacy in employment: A comparative analysis. Hart Publishing. http://dx.doi.org/10.5040/9781509906147
Perceptyx. (n.d.). [Computer software]. https://www.perceptyx.com
Ponce Del Castillo, A. (2016). Occupational safety and health in the EU: Back to basics. In B. Vanhercke, D. Natali, & D. Bouget (Eds.), Social policy in the European Union: State of play 2016 (pp. 131-155). European Trade Union Institute; European Social Observatory. https://www.etui.org/sites/default/files/Chap%205.pdf
Ponce Del Castillo, A., & Molè, M. (2024). Worker monitoring vs worker surveillance: The need for a legal differentiation. In A. Ponce Del Castillo (Ed.), Artificial intelligence, labour and society (pp. 157-173). European Trade Union Institute. https://www.etui.org/sites/default/files/2024-03/Chapter13_Worker%20monitoring%20vs%20worker%20surveillance%20the%20need%20for%20a%20legal%20differentiation.pdf
Potocka-Sionek, N., & Aloisi, A. (2025). The spillover effect of algorithmic management and how (not) to tame it. In K. Vandaele & S. Rainone (Eds.), The Elgar companion to regulating platform work: Insights from the food delivery sector (pp. 253-271). Edward Elgar Publishing.
Prassl, J. (2015). The concept of the employer (1st ed.). Oxford University Press. https://doi.org/10.1093/acprof:oso/9780198735533.001.0001
Ratti, L. (2023). The sword and the shield: The directive on adequate minimum wages in the EU. Industrial Law Journal, 52(2), 477-500. https://doi.org/10.1093/indlaw/dwad001
Risak, M. E., & Dullinger, T. (2018). The concept of "worker" in EU law: Status quo and potential for change. European Trade Union Institute. https://www.etui.org/publications/reports/the-concept-of-worker-in-eu-law-status-quo-and-potential-for-change
Schermer, B. W., Custers, B., & van der Hof, S. (2014). The crisis of consent: How stronger legal protection may lead to weaker consent in data protection. Ethics and Information Technology, 16(2), 171-182. https://doi.org/10.1007/s10676-014-9343-8
Simitis, S. (1986). Gesetzliche Regelungen für Personalinformationssysteme - Chancen und Grenzen. In A. Von Schoeler (Ed.), Informationsgesellschaft oder Überwachungsstaat? (pp. 43-64). VS Verlag für Sozialwissenschaften. https://doi.org/10.1007/978-3-322-89373-4_3
Simitis, S. (1987). Reviewing privacy in an information society. University of Pennsylvania Law Review, 135(3), 707-746. https://doi.org/10.2307/3312079
Supiot, A. (2015). Critique du droit du travail. Presses Universitaires de France.
Svantesson, D. J. B. (2018). Enter the quagmire - the complicated relationship between data protection law and consumer protection law. Computer Law & Security Review, 34(1), 25-36. https://doi.org/10.1016/j.clsr.2017.08.003
Trzaskowski, J. (2011). Behavioural economics, neuroscience, and the unfair commercial practises directive. Journal of Consumer Policy, 34(3), 377-392. https://doi.org/10.1007/s10603-011-9169-2
Trzaskowski, J. (2022). Data-driven value extraction and human well-being under EU law. Electronic Markets, 32(2), 447-458. https://doi.org/10.1007/s12525-022-00528-0
Viljoen, S. (2021). A relational theory for data governance. The Yale Law Journal, 131(2), 573-654. https://www.yalelawjournal.org/pdf/131.2_Viljoen_1n12myx5.pdf
Wood, A. J. (2021). Algorithmic management: Consequences for work and organisation and working conditions (JRC Publications Repository No. JRC124874). European Commission. https://publications.jrc.ec.europa.eu/repository/handle/JRC124874
* Ph.D. researcher in labour law, University of Groningen. I am grateful to the anonymous reviewers and Einat Albin for their valuable comments and suggestions on this article. I am also grateful for the feedback received from the participants in the Privacy@Work workshop organised by Einat Albin in May 2023.
1. See, among others, the European Court of Human Rights: "Due to the technological ease with which electronic surveillance can be carried out [...] a clear and foreseeable legal framework, with appropriate and effective safeguards, becomes of paramount importance," 1874/13 and 8567/13, López Ribalda and Others v. Spain, ECLI:CE:ECHR:2019:1017JUD000187413 (2019), Joint dissenting opinion of Judges De Gaetano, Yudkivska, and Grozev, para 4.
2. Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation). Reg. 2016/679 (GDPR), Article 4(7).
3. GDPR, Article 4(1).
4. GDPR, Article 88.
5. Regulation (EU) 2024/1689 of the European Parliament and of the Council of 13 June 2024 laying down harmonised rules on artificial intelligence. EU Reg. 2024/1689 (EU AI Regulation).
6. Directive (EU) 2024/2831 of the European Parliament and of the Council of 23 October 2024 on improving working conditions in platform work. EU Dir. 2024/2831 (PWD).
7. GDPR, Articles 12-22.
8. GDPR, Articles 24, 25, 35, among others.
9. According to Article 4(2) GDPR, data processing means "any operation or set of operations which is performed on personal data or on sets of personal data, whether or not by automated means, such as collection, recording, organisation, structuring, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, restriction, erasure or destruction."
10. GDPR, Articles 5, 6, 9, among others.
11. Case C-66/85 Deborah Lawrie-Blum v Land Baden-Württemberg [1986] ECR 2121, para 17. See also: ECJ, C-197/86 Steven Malcolm Brown v The Secretary of State for Scotland [1988] ECR 03205, para 21; ECJ, C-344/87 Bettray [1989] ECR 01621, para 14; C-94/07 Andrea Raccanelli v Max-Planck-Gesellschaft zur Förderung der Wissenschaften eV [2008] ECR I-05939, para 34.
12. GDPR, Chapter III.
13. GDPR, Article 15.
14. GDPR, Articles 13-14.
15. GDPR, Article 16.
16. GDPR, Article 15.
17. Case C-434/16, Novak v Data Protection Commissioner, ECLI:EU:C:2017:994.
18. GDPR, Recital 71. See also Article 29 Data Protection Working Party (2018, p. 21).
19. GDPR, Articles 15-22.
20. GDPR, Article 88.
21. GDPR, Articles 12-22.
22. GDPR, Recital 4: "This Regulation respects all fundamental rights and observes the freedoms and principles recognised in the Charter as enshrined in the Treaties, in particular the respect for private and family life, home and communications, the protection of personal data, freedom of thought, conscience and religion, freedom of expression and information, freedom to conduct a business, the right to an effective remedy and to a fair trial, and cultural, religious and linguistic diversity." See also Viljoen (2021, pp. 623-628).
23. Those competences are part of the ones shared with the Member States; see Article 4 TFEU.
24. GDPR, Recital 12: "Article 16(2) TFEU mandates the European Parliament and the Council to lay down the rules relating to the protection of natural persons with regard to the processing of personal data and the rules relating to the free movement of personal data."
25. GDPR, Recital 7: "Those developments [in digital technologies] require a strong and more coherent data protection framework in the Union, backed by strong enforcement, given the importance of creating the trust that will allow the digital economy to develop across the internal market." See also Viljoen (2021, pp. 623-628).
26. See also Kerber (2016, p. 856).
27. The Directive refers to the following case: CJEU Case C-210/96 Gut Springenheide GmbH and Rudolf Tusky v Oberkreisdirektor des Kreises Steinfurt [16 July 1998].
28. EU Reg. 2011/1169, Recital 41.
29. GDPR, Articles 12-22.
30. GDPR, Article 9.
31. GDPR, Article 88.
32. GDPR, Articles 12-22.
33. GDPR, Articles 12-22.
34. Respectively: Right to lodge a complaint with a supervisory authority (Article 77), Right to an effective judicial remedy against a supervisory authority (Article 78), Right to an effective judicial remedy against a controller or processor (Article 79), and Right to compensation and liability (Article 82).
35. The table is reworked from Svantesson (2018, p. 29).
36. GDPR, Articles 12-14.
37. Due to space limitations, I am unable to delve into the principles that data controllers must implement when establishing data processing. These principles, as detailed in Article 5, include: (a) lawfulness, fairness, transparency; (b) purpose limitation; (c) data minimisation; (d) accuracy; (e) storage limitation; (f) integrity and confidentiality. A more in-depth explanation is available in the EDPB guidelines. See European Data Protection Board (2020a, p. 5).
38. Article 25(1) refers in particular to techniques such as pseudonymisation and the minimisation principles as operational guidelines for data controllers establishing data processing. See European Data Protection Board (2020a, p. 11).
39. GDPR, Article 25.
40. Such data processing is further identified as "high risk" in Recital 75 of the Regulation and in the guidelines on the DPIA. See Article 29 Data Protection Working Party (2017a, p. 11).
41. Developed by Cogito Corp. See Cogito: Real-Time AI Coaching & Guidance for Contact Centers (n.d.).
42. Developed by Netradyne Inc. See Netradyne (n.d.).
43. Developed by Perceptyx Inc. See Perceptyx (n.d.).
44. The EDPB outlines "controllers" and "processors" as functional concepts. See European Data Protection Board (2020b). See further in the previous guidelines, Article 29 Data Protection Working Party (2010, p. 9). See also: Case C-131/12, Google Spain SL and Google Inc. v Agencia Española de Protección de Datos (AEPD) and Mario Costeja González, judgment of 13 May 2014, paragraph 34; Case C-210/16, Wirtschaftsakademie Schleswig-Holstein, judgment of 5 June 2018, paragraph 28; CJEU, Case C-40/17, Fashion ID GmbH & Co. KG v Verbraucherzentrale NRW eV, judgment of 29 July 2019, paragraph 66.
45. GDPR, Recital 81: "To ensure compliance with the requirements of this Regulation (...) the controller should use only processors providing sufficient guarantees, in particular in terms of expert knowledge, reliability and resources, to implement technical and organisational measures which will meet the requirements of this Regulation." This allows then the data controller to demonstrate compliance with the GDPR as required by Article 24 ("Responsibility of the controller").
46. The Italian DPA fined the company Wind Tre for unauthorised marketing practices, citing the company's failure to control its processors, who conducted campaigns without proper oversight. Similarly, the Spanish DPA fined Vodafone for not ensuring its processor maintained GDPR compliance. See Kalsi (2024, p. 357).
47. GDPR, Article 40.
48. GDPR, Article 42.
49. GDPR, Article 28(3)(e).
50. GDPR, Article 28(3)(f).
51. GDPR, Recital 95.
52. GDPR, Article 26: "Where two or more controllers jointly determine the purposes and means of processing, they shall be joint controllers."
53. EU Reg. 2024/1689 (AI Regulation), Article 79(1) refers to Article 3, point 19 of EU Reg. 2019/1020 on market surveillance and compliance of products, which defines the features of dangerous products and how they should be handled in a working environment. The employer shall implement the system following the manufacturer's guidelines and promptly report malfunctions. This illustrates how the AI Regulation focuses on requirements for providers so as to allow the deployer-employer to have an understanding of the purchased AI system.
54. GDPR, Recital 81.
55. GDPR, Articles 12-22.
56. GDPR, Articles 24 and 25.
57. López Ribalda and Others v. Spain, App no 1874/13 and 8567/13 (ECtHR, 17 October 2019), Joint dissenting opinion of Judges De Gaetano, Yudkivska, and Grozev, para 4.
58. AI Regulation, EU Reg. 2024/1689.
59. PWD, EU Dir. 2024/2831.
60. For instance, AI Regulation, Article 5(1)(f) prohibits "AI systems to infer emotions of a natural person in the areas of workplace and education institutions." See also the prohibitions of PWD, Article 7.
61. PWD, Articles 7-15.
62. AI Regulation, Articles 6-49.
63. GDPR, Articles 12-22.
64. GDPR, Articles 24, 25, 35.
65. GDPR, Articles 12-14.
66. GDPR, Articles 15-22.
67. GDPR, Article 4(11).
68. GDPR, Article 9.
69. GDPR, Article 88.
70. GDPR, Articles 24, 25, 35.
Copyright Comparative Labor Law & Policy Journal 2025