Abstract
This paper draws upon James Scott's Seeing Like a State (1998) to argue that privacy law currently suffers from (at least) three defects: a focus on the legibility of individuals that is too narrow, a focus on collection and subsequent use of data that comes too late, and a focus on rights and harms that ignores the need to create new social structures that can empower more local forms of collective decision-making. What this outlines in broad brushstrokes is the need to enfold privacy concerns within a broader data governance framework concerned with the fair and just terms of social legibility.
In December 2021, the news broke in Canada that the federal government had accessed information from thirty-three million mobile devices during a COVID-19 public health lockdown (Oli 2021). Public criticism was swift and included a media campaign calling for stronger privacy laws, ones that strengthen consent and transparency and provide meaningful penalties for their breach (Open Media 2022). One of the telecommunications companies involved, Telus, denied that it had breached any laws, because the data at issue had been properly de-identified, and defended its "Data for Good" program, which aimed to provide aggregate de-identified data for socially beneficial purposes (Canada 2022). The House Committee on Access to Information, Privacy and Ethics held hearings and heard testimony from many different parties. Its recommendations included improved transparency, meaningful consultation with the Privacy Commissioner even if the data is de-identified, and that users be permitted to opt out of this type of data use (Canada 2022).
Two things stand out from this story. The first is the way in which the architecture of current privacy laws, which focus on regulating personally identifiable information, fails. Under the current model, de-identified information is not regulated in Canada, resulting in no formal regulatory oversight of the ex ante process of de-identification. This opens a large space for social and political distrust to take root. The second is that even if the process of de-identification is managed responsibly, there remains a question of who decides about the subsequent information use(s): the data donors who are no longer identifiable, the government, the private companies who originally collected the data, or some other party?
In the rest of this comment, I want to focus on this second issue and outline why I think the "who decides?" question is deeply challenging and should point us away from a focus on privacy rights towards a focus on what I will call the fair and just terms of "social legibility." Privacy law scholarship will need to enlist the insights and methodologies of many other disciplines, including surveillance studies, in order to address this new agenda. However, as a privacy law scholar, my aim is to outline what this reform agenda should look like and invite others to the conversation rather than indicate what I think these others might have to say.
The "who decides?" question is ultimately a question about authority and raises issues of power and legitimacy. Privacy rights can be seen as one traditional answer to this question, the answer being that it is the individual data subject who should decide about the collection and use of data about her, subject to the limits that might be justified in a free and democratic society. As many theorists have pointed out, privacy rights are not just of individual importance but protect important social values as well (Regan 1995). Strengthening these rights, such as by extending them to data "from," but no longer "about," an individually identifiable data subject, is one strategy for addressing contemporary concerns about state and private sector power. However, I think that this response will fail. To explain why, in what follows I take three lessons from James Scott's work Seeing Like a State (1998) and indicate why these lessons point us past privacy and instead towards social legibility. In choosing this work, I do not mean to suggest that other theorists more generally taken up by either privacy law scholars or surveillance studies scholars do not make these points in their own way. Rather, I offer these observations in the spirit of Koops' (2022) admonition in this issue that scholars should read more and that we should read outside our lists of usual suspects.
What is social legibility? I take the term "legibility" from Scott (1998: 183) and his observations on the relationship between legibility and governance:
Legibility is a condition of manipulation. Any substantial state intervention in society- to vaccinate a population, produce goods, mobilize labor, tax people and their property, conduct literacy campaigns, conscript soldiers, enforce sanitation standards, catch criminals, start universal schooling-requires the invention of units that are visible. The units in question might be citizens, villages, trees, fields, houses, or people grouped according to age, depending on the type of intervention. Whatever the units being manipulated, they must be organized in a manner that permits them to be identified, observed, recorded, counted, aggregated, and monitored.
Legibility is about making some aspect of the social "visible" in order to govern. The coupling of legibility and governance is important, for it highlights the connection between visibility and the power others can have over you. However, the first lesson I want to draw from Scott (1998) is that legibility is not necessarily about individuals but can be about groups and is not necessarily even about people (e.g., trees, fields) but about making aspects of the world legible in order to govern people (and not necessarily as individuals). As Galič (2022) argues in this issue, it is this broader idea of legibility that is at stake in many contemporary data practices, such as those she documents in relation to smart cities.
If we should be concerned with social legibility rather than privacy, then privacy law should give way to a broader regulatory regime of data governance. Data governance here is meant to be much more than simply signifying the ways in which organizations implement their obligations under privacy statutes. It focuses on how data about people are governed as well as the ways in which data are used to govern people, and the associated questions of authority and legitimacy.
Although governance is associated with the exercise of power and authority, there can be many different models of governance. There are not only authoritarian models of governance but also democratic models, not only corporate governance but also self-governance and community governance. Governance can be oppressive, but it can also enable coordinated social activity for laudable collective purposes. We need to develop our understanding of the fair and just terms of social legibility and make this the foundation for models of data governance.
The second lesson that I want to take from Scott's (1998) work is that practices of legibility are not mapping exercises that aim at "seeing" what is already there but are acts of world-making. Increasingly, we live in an age of "data," with its dominant meaning of "that which is given" (Rosenberg 2021). Indeed, we often hear talk about "raw data" as though it connotes a basic ground truth about the world and data as a "resource" to be allocated and used. But Scott's (1998) analysis reminds us that practices of legibility are not about accessing what is "really" there but are instead about the imposition of a representation on diverse and complex social phenomena. Practices of legibility can also distort and push out other practices. For example, Scott (1998) documents many modernist planning practices that distort forms of local knowledge as they shape the world into an "administrative grid" suitable for particular governance objectives. Today it is often the private sector that has taken on the primary power of remaking the world. The basic point here is that "data" are not a neutral resource reflecting what is "given" but something that is both shaped and, in turn, does the shaping. Privacy law's interventions at the point of collection or use of data come too late if we are interested in analyzing relations of power in our practices of social legibility.
The third lesson I want to draw from Scott's (1998) work is the need to think broadly about the social structures that are important for creating conditions of illegibility that can assist in resisting problematic forms of governance. While Scott (1998) acknowledged the successes of modernist planning, he also warned against authoritarian versions. For him, the three decisive factors in resisting authoritarian high modernist planning have been: (1) a private sphere where the state cannot interfere; (2) a strong private sector, where the economy is too complex to be managed; and (3) democracy, because in such settings "high modernist schemes... must accommodate themselves sufficiently to local opinion in order to avoid being undone at the polls" (Scott 1998: 102). On this account, a robust private sector and private sphere are important aspects of illegibility in relation to the state and so of resistance to state power. In contrast, what we see today is state governance goals utilizing data created by the private sector. Rather than constrain state power, the private sector can augment (or supplement) state power in some circumstances, something that became clear with the Snowden revelations. The private sector has also, in many contexts, displaced the state as the locus of concern regarding power, as we are now concerned about how the private sector utilizes data to effectively govern many aspects of contemporary life. This private governance operates at a global scale, making it difficult to bring under democratic control.
There is much to unpack here, but I want to pull on one thread, and I want to do so from the perspective of private law (but not privacy law). The recent attention in legal scholarship to the role of competition law in constraining private power is one example of thinking about how law structures power relations in the private sector. In fact, many aspects of the private sector are constructed by law. This is not law in the form of doctrines that vindicate individual rights or provide redress for various harms but rather law in its facilitative function. Legal frameworks like contract and corporate law provide legal structures for coordinating social activity and collective agency. A question we should have in the data age is whether we need new legal forms that can facilitate new modes of social organization and collective agency in relation to data, forms that would facilitate the creation of structures of resistance to both private sector and state power. There is growing interest in forms like data trusts and data collectives and debate about what these should look like and whether they can be constructed through existing tools of private ordering (Austin and Lie 2021; Huq 2021; Viljoen 2021). But this debate will be far too narrow if we think only in terms of how to create intermediaries to better manage individual rights of control (Delacroix and Lawrence 2019). We also need to think about what forms of collective governance over social legibility are desirable and open space for a more varied set of answers to the "who decides?" question, including by doing something surveillance studies has, at turns, done well: learning from and amplifying the practices of those most acutely subordinated by surveillance regimes.
To sum up, the three lessons for privacy law that I take from Scott's Seeing Like a State (1998) are that a focus on the legibility of individuals is too narrow, a focus on collection and subsequent use of data comes too late, and a focus on rights and harms ignores the need to create new social structures that can empower more local forms of collective decision-making. What this outlines in broad brushstrokes is the need to enfold privacy concerns within a broader data governance framework concerned with the fair and just terms of social legibility. To return to the "who decides?" question, we might say that when we turn our focus away from identifiable information then the answer might sometimes be individual data donors, democratic institutions, or corporations; but sometimes it might be social groups of various kinds acting together. And if we are interested not just in the formal or legal answer to the "who decides?" question, then we need to ensure that the practices of social legibility underpinning our "data" are ones that enable the forms of social governance we seek.
Finally, I want to be clear that shifting to a framework of data governance rooted in the fair and just terms of social legibility does not mean downplaying the importance of privacy. Privacy remains of vital importance. However, thinking about privacy as one aspect of a broader concern for social legibility can help us think about a range of interests that individuals might have in relation to data and that might justify rights of control, participation, or contestation. Let me give one example, tied back to my original story about Canadian mobility data.
One of the recommendations that the House Committee made is that individuals be given the ability to opt out of having their data shared with the government for socially beneficial purposes even if that information is responsibly de-identified (Canada 2022). It is not clear to me that privacy provides a way of understanding why this should be. The analysis offered in this short reflection suggests that the answer to the "who decides?" question should be rooted in a theory of the fair and just terms of social legibility and also that there is room for forms of collective governance that provide a different answer to the "who decides?" question than the seemingly available choices of individual, government, or corporation. We might be concerned, for example, that allowing opt-outs for important public health data will result in problematic data gaps that will have detrimental social impacts. But suppose we want to defend the individual opt-out option as a default in some cases of data-sharing for "social good," where the decision is being made as a matter of corporate altruism. If re-identification is not a real risk (suppose), then opting out is not so much a matter of protecting privacy as it is a matter of not wanting to be co-opted into participating in something one disagrees with. An analogy might be not wanting to purchase clothing made with child labor or to invest in mutual funds that include certain kinds of companies. The issue there is not that the individual consumer is harmed in some way or has their rights violated; it is that they have a stake in having the social sphere reflect their own values. A focus on the fair and just terms of social legibility, the theorizing of which will require attention to ideas of participation and contestation, social values, and social harms, among others, provides a potentially richer lens through which to think about these questions.
And this will, in turn, allow us to be much sharper about which issues are in fact issues about privacy without the fear that if we say that they are not, then somehow that means everything else is permitted.
References
Austin, Lisa M., and David Lie. 2021. Data Trusts and the Governance of Smart Environments: Lessons from the Failure of Sidewalk Labs' Urban Data Trust. Surveillance & Society 19 (2): 255-261.
Canada. 2022. House of Commons. Standing Committee on Access to Information, Privacy and Ethics (ETHI). Collection and Use of Mobility Data by the Government and Related Issues. May. https://www.ourcommons.ca/DocumentViewer/en/44-1/ETHI/report-4/.
Delacroix, Sylvie, and Neil D. Lawrence. 2019. Bottom-up Data Trusts: Disturbing the "One Size Fits All" Approach to Data Governance. International Data Privacy Law 9 (4): 236-252.
Galič, Maša. 2022. Smart Cities as "Big Brother Only to the Masses": The Limits of Personal Privacy and Personal Surveillance. Surveillance & Society 20 (3): 306-311.
Huq, Aziz. 2021. The Public Trust in Data. Georgetown Law Journal 110 (2): 333-402.
Koops, Bert-Jaap. 2022. Goodbye to Publications, or Confessions of a Privacy Law Scholar. Surveillance & Society 20 (3): 312-316.
Oli, Swikar. 2021. Canada's Public Health Agency Admits It Tracked 33 Million Mobile Devices During Lockdown. National Post, December 21. https://nationalpost.com/news/canada/canadas-public-health-agency-admits-it-tracked-33-million-mobile-devices-during-lockdown [accessed August 30, 2022].
Open Media. 2022. Telus Gave Your Location Data to the Feds. Now What? January 22. https://openmedia.org/article/item/telus-location-data [accessed August 30, 2022].
Regan, Priscilla M. 1995. Legislating Privacy: Technology, Social Values, and Public Policy. Chapel Hill, NC: University of North Carolina Press.
Rosenberg, Daniel. 2021. Data. In Information: A Historical Companion, edited by Ann Blair, Paul Duguid, Anja-Silvia Goeing, and Anthony Grafton, 387-391. Princeton, NJ: Princeton University Press.
Scott, James C. 1998. Seeing Like a State: How Certain Schemes to Improve the Human Condition Have Failed. New Haven, CT: Yale University Press.
Viljoen, Salome. 2021. A Relational Theory of Data Governance. Yale Law Journal 131 (2): 573-653.
© 2022. This work is published under https://creativecommons.org/licenses/by-nc-nd/4.0 (the "License").
Author affiliation: University of Toronto, Canada