Abstract
This article discusses smart surveillance based on the particular case of the Brazilian mobile app Monitora Covid-19 from the perspective of issues related to personal-data protection. Brazil is today one of the epicenters of the pandemic. The application under analysis is the tip of a wide network of data monitoring and medical assistance formed by public and private institutions. Based on a neo-materialist analysis of this network, this article discusses the use of surveillance technologies and data during the period of the pandemic; describes and comments on visible, discursive, and hidden materialities; and indicates the main issues of the application in the use and protection of users' personal data. In conclusion, it indicates some of the application's flaws in relation to personal-data protection. More broadly, it reinforces the need for the creation of publicly controlled regulatory bodies for smart surveillance systems that are able to oversee the application of (public and private) technologies with ethical assurances and public control.
Introduction
Moments of global crisis point to urgent reactions on the part of the state, accompanied by pressures from related industrial and corporate sectors and public concern. In these contexts, the government and population become more receptive to technological experiments that promise to deal with the crisis (Newell 2021; Firmino and Evangelista 2020). This is the case with the COVID-19 pandemic. The actions under scrutiny, exceptionally less rigid, are basically related to three types of situations: the search for a cure or preventive treatment for the problem (experiments and clinical tests in the healthcare field); the creation of facilities to provide a sense of normality (home office, distance learning, etc.); and the contingent management of the crisis (palliative solutions to economic loss, measures for monitoring contagion, treatment of infected members of the population, administration of the healthcare system and its treatment capacity, etc.).
One of the direct consequences of the new coronavirus pandemic associated with the third type of situation is, therefore, the mobilization of smart surveillance technologies as ways of controlling the contagion. Since there is still no cure, as the COVID-19 pandemic spreads around the world, the use of digital control, monitoring, and surveillance strategies is becoming more common in mitigation attempts. A variety of techniques are being tested in the world today: geo-localization with flow and movement mapping based on cell phone data; contact tracing using Bluetooth to identify those who have come into contact with infected people or those with symptoms; symptom tracking; drones for viewing crowds and helping to strengthen social distancing; electronic wristbands for monitoring; facial-recognition cameras; and thermal cameras for identifying people with fevers.
With the emergence of the pandemic, the use of data-monitoring techniques and technologies, characteristic of the current form of data capitalism (West 2019)-in line with what has also been termed surveillance capitalism (Zuboff 2019) and platform capitalism (Srnicek 2017)-has found fertile and favorable terrain for growth and uncritical adoption, and seems to be irreversible. This article intends to use Brazil as a concrete example of a country that is today one of the global epicenters of the crisis to reveal how this smart surveillance raises issues that affect policies concerning the use of personal data and privacy.
Based on the particular case of the Monitora Covid-19 mobile application, the article seeks to discover how intelligent mass surveillance produces and addresses issues regarding the privacy and protection of personal data. The mobile app is analyzed as an "object-network" from a neo-materialist, pragmatic, and immanent viewpoint (Fox and Alldred 2017; Lemos 2020) to investigate how it undertakes and raises specific questions related to data and privacy (discussion, documents, interfaces, institutions, safeguards, and policies). This approach allows us to verify not only the application in isolation but also the whole network it forms, which gives it meaning.1 The paper aims to emphasize an immanent analysis of the technology to unveil the discursive and material practices concerning the issues of data surveillance and privacy. Therefore, we argue that by examining the mobile app's many material layers (visible, discursive, and hidden), it becomes possible to engage in a broader discussion about the ethical deployment of such technologies during times of crisis-which is the de facto state of the pandemic in Brazil.
At the time of writing, Brazil has exceeded 645,000 deaths from COVID-19 and twenty-eight million people infected, the second highest death toll from COVID-19 in the world (behind only the US).2 But Brazil stands out for the level of its federal government's denialism and refusal to adopt the solutions most recommended by scientific knowledge-with its president, Jair Bolsonaro, leading the movement of disinformation about the disease and the pandemic and perfectly fitting the image of a "covidiot" as suggested by Trottier, Huang, and Gabdulhakov (2021). On one hand, this has exposed the population to a collection of ineffective responses that have led to higher death rates. On the other hand, it has meant weak adoption of more invasive surveillance technologies directly related to the health problem. Since the beginning of the pandemic, Brazil has had four different health ministers and went several months without one, until an interim replacement for the sacked (second) minister was made de facto minister, only to be replaced himself ten months later. In this context, actions for controlling the disease using surveillance technologies were carried out by state and local governments rather than by the federal authorities.
Given the lack of a comprehensive action coordinated by the Brazilian federal government,3 the chosen research object became one of the most relevant initiatives in the fight against the COVID-19 pandemic in the country. The Monitora Covid-19 application (Android and iOS) was established by the state government of Bahia in a partnership between public and private institutions. It was developed by Novetech4 and later adopted by the technological wing (FESF-tech) of the State Family Health Foundation (FESF). Following an agreement between FESF and the Northeast Consortium (bringing together state governors from northeast Brazil in the fight against the new coronavirus), the application is now operating under the Northeast Consortium Scientific Committee, and its operation has expanded to all states in the region and later to the whole country. The necessary investment for running the application is relatively low.5 It provides specialist medical attention remotely (through messaging via the application and telephone calls), geo-location data, individual monitoring of case evolution, and secure consolidation of data about the pandemic.
There are other applications in Brazil, but among those most downloaded from the Google Play Store, Monitora Covid-19 is the one with the most robust and sophisticated construction and support network. An additional reason that justifies the chosen research object is the fact that the Monitora Covid-19 app is a key element in the political struggle between the Federal Government and its opponents. More specifically, the management of the pandemic has further intensified the political dispute between President Jair Bolsonaro and the governors of the Northeastern states, which is the only region where Bolsonaro had fewer votes than his opponent, Fernando Haddad, during the 2018 general elections.
The application's expected and ideal user base is formed by each and every person presenting COVID-19 symptoms. However, since Monitora Covid-19 is an optional service, it is not possible to enforce its usage by everyone. Mass adoption would be beneficial, though, since it potentially helps local public healthcare agencies in better understanding the current state of the pandemic. The aim is to provide local monitoring, remote medical assistance, and consequently, to relieve pressure on the public health service and reduce the probability of contagion. More than 314,239 downloads had been recorded by August 2021, with 83,000 users monitored and 53,000 medical consultations conducted. Geo-referencing demonstrates that the application is underused in impoverished areas due to a lack of economic access to the technology and wireless networks. If a remote consultation is required, the protocol establishes a maximum twenty-four-hour timeframe for provision of the service.
The application is, in reality, the tip of a broad network of data monitoring and medical assistance and is the most important intelligent healthcare surveillance in use in Brazil, considering "intelligent" surveillance as the treatment of digital data through the use of specific algorithms in public data banks, on one hand, and coordinated and effective action by participating public healthcare agencies for patient care on the other.
This article contains four other sections in addition to the introduction and the conclusion: a brief contextualization of the use of surveillance and data technologies in the period of the pandemic; a description of the assessment method for the Monitora Covid-19 application; an analysis of the visible, discursive, and hidden materiality of the application; and an exploration of the application's role in the use and protection of personal data.
Pandemic Data
Forms of pandemic management (the need for detecting infected people, control of the contagion through territorial containment such as lockdowns and social isolation, monitoring and surveillance measures using digital media and other methods, etc.) seem to have a close relationship to the notion of governmentality (Foucault 2007), in a broader sense, and to the idea of state pastoral care as an instrument of political power in informing and directing the care of tracked citizens. Didier Bigo (2020: 2) corroborates this understanding (although without spelling out such a relationship with biopolitics) when questioning the dividuality of our data in such tracking and tracing networks:
It's doubtful if a state of "normality" would even be possible through a surveillance network of tracing applications. Would we really be able to regain our freedom of movement if that movement is under constant surveillance, governed by digital applications? As many security professionals currently seek to reimagine our future, will we let them treat us like herds in a pasture by coupling each person's biological identifiers with their digital identification?
The implementation of "smart surveillance" technologies reveals the problem of using personal data and puts at risk one of the pillars of modern democracies: the right to privacy and personal-data protection. It is, therefore, not just a question of ensuring the individual liberty of each citizen to "be let alone" (Warren and Brandeis 1890) in a sort of individual right that is negative and selfish (Westin 1967). The central question is to ensure a healthy balance between government action and the autonomy of individuals in the maintenance of social wellbeing, especially in the conditions of a pandemic. Full development of the democratic state, therefore, depends on the consideration of privacy protection as the ability of subjects to negotiate control of the flow of information concerning them (Doneda 2019) in the transition from a paradigm of secrecy to an environment of autonomy and control (Rodotà 2008).
Governments have become enthusiastic about technological solutions as a way of fighting the pandemic (Bigo 2020; Bunz 2020; Firmino and Evangelista 2020). Institutions concerned with digital activism in favor of human rights-such as Privacy International6 and Human Rights Watch7-emphasize the need for transparency and responsibility in their use. In light of the history of capitalism prospering from conditions of crisis-which Klein (2014) terms "shock doctrine" and "disaster capitalism"-critics warn of the potential of this occurring in the case of the new coronavirus pandemic. Kitchin (2020) points to the danger of a "covid washing," in which the pandemic would provide justification for the establishment and maintenance of abusive practices of digital surveillance and social control even after the crisis is over.
This type of criticism has directed the public debate, concentrating on the benefits and harm arising from the broad use of personal data in combating COVID-19. The figures are revealing and become quickly outdated with the constant search for solutions and urgent need to control the crisis. According to Woodhams (2020), data from July 2020 indicated the use of eighty contact-tracing applications in fifty countries (35% using Bluetooth, 34% using GPS, and 24% using Bluetooth and GPS), other digital approaches without applications (such as cell phone data for monitoring social isolation) in thirty-five countries, along with eleven countries using physical surveillance measures (facial recognition, drones, etc.).
While the actions comprising contact tracing are not new (dating back to other epidemics throughout the nineteenth and twentieth centuries) and despite similarities with the constraints of other methods-the more data about those infected, the greater the chances of producing efficient scenarios of mapping the virus spread-the possibilities for implementing mass processes of automatic data collection have never been greater. Some actions should therefore be envisaged to prevent initiatives of this magnitude from violating human rights and being transformed into mechanisms for social control and permanent mass surveillance. These are (Lemos and Marques 2020):
1. Respect for data protection laws such as GDPR (in Europe) or the Brazilian LGPD (Lei 13.709/18),8 strengthening an institutional culture of respect for privacy and data protection;
2. The creation of systems with "opt-in" and "opt-out" mechanisms (in which the user has to consent to data use and decide when to remove this consent) during their whole life cycles;
3. Clear identification of who has developed the system (state, private companies, public-private partnerships, etc.) in order to address duties and responsibilities;
4. Guarantees of data security (by using cryptography and certified anonymity systems, for example), identifying who can have access to these data, for how long, and for what purpose;
5. Algorithms that can be auditable (by specialists or the general public), ensuring transparency and revealing possible biases (race, gender, social class, nationality);
6. And the creation of public communication mechanisms, with broad discussion and information about the purposes and limits of initiatives, including the destination of collected data and when their use will cease.
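The opt-in/opt-out mechanism described in item 2 above can be sketched in code. The sketch below is purely illustrative: the class and method names (`ConsentRegistry`, `opt_in`, `opt_out`, `may_process`) are our own assumptions rather than part of any existing system. The core idea is that consent is recorded per user and per processing purpose, and processing is only permitted while consent remains active.

```python
# Illustrative sketch of an opt-in/opt-out consent lifecycle (item 2 above).
# All names are hypothetical; no real application's API is being described.
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional


@dataclass
class ConsentRecord:
    purpose: str                           # why the data are processed
    granted_at: datetime                   # when the user opted in
    revoked_at: Optional[datetime] = None  # when the user opted out, if ever

    @property
    def active(self) -> bool:
        # Consent is active from opt-in until an explicit opt-out.
        return self.revoked_at is None


class ConsentRegistry:
    """Tracks consent per (user, processing purpose) pair."""

    def __init__(self) -> None:
        self._records: dict[tuple[str, str], ConsentRecord] = {}

    def opt_in(self, user_id: str, purpose: str) -> None:
        self._records[(user_id, purpose)] = ConsentRecord(
            purpose, granted_at=datetime.now(timezone.utc))

    def opt_out(self, user_id: str, purpose: str) -> None:
        record = self._records.get((user_id, purpose))
        if record is not None and record.active:
            record.revoked_at = datetime.now(timezone.utc)

    def may_process(self, user_id: str, purpose: str) -> bool:
        # Default is no processing: absence of a record means no consent.
        record = self._records.get((user_id, purpose))
        return record is not None and record.active


registry = ConsentRegistry()
registry.opt_in("user-1", "symptom-monitoring")
assert registry.may_process("user-1", "symptom-monitoring")
registry.opt_out("user-1", "symptom-monitoring")
assert not registry.may_process("user-1", "symptom-monitoring")
```

The design choice worth noting is that the default is denial: without an explicit opt-in record, no processing is permitted, which mirrors the consent-first logic of the GDPR and the LGPD.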
Along the same lines, Morley et al. (2020) propose an ethical guide for contact-tracing applications, suggesting a list of sixteen questions for checking the need and justifications for the creation of applications. Recommendations from the European Convention on Human Rights indicate that to be considered ethical these types of applications need to meet four basic principles: they have to be necessary, proportional, scientifically valid, and time-bound (see Morley et al. 2020). Such parameters serve two purposes. On one hand, they assist governments and healthcare institutions in the proposal of ethical technological solutions from the perspective of the management of personal data. On the other hand, they function as scrutiny criteria for organizations concerned with the observation and protection of human rights.
A degree of suspicion towards these applications, as a form of generating audits and reviews, is fundamental. Although most countries have chosen a voluntary form of adopting the applications, and they have largely been developed according to "privacy-by-design" concepts (in accordance with the European GDPR), according to Bigo (2020: 9), this is no guarantee that privacy will not be affected and that discriminatory processes will be avoided:
That is why it must be stressed that current proposals for tracing applications must in no way be judged on their technical performances (including privacy design only). They must be judged by following a "citizen design" protocol, which in this case must start from a genuine clinical definition of health objectives. If we are not careful, Covid-19 applications could become routine surveillance measures for the purpose of overseeing compliance with the rules, including progressive deconfinement of a part of the population, as well as punishment for patients who do not declare themselves.
This article analyzes the use of these types of technologies in Brazil, focusing on personal data protection and user privacy. It is important to add that although Brazil is globally recognized as one of the first nations to propose a progressive, inclusionary, and participatory internet regulatory bill (Medrano 2015; Souza, Steibel, and Lemos 2017)-Brazil's Internet Bill of Rights, or Marco Civil da Internet in Portuguese-the creation of specific legislation on data protection has been delayed, falling behind other Latin American countries. The Brazilian LGPD only came into effect in September 2020. Implementation was postponed by the national congress with the justification that companies and institutions would find it difficult to adapt to the legislation due to the effects of the pandemic. This late adoption of policies and legislation explicitly concerning data protection relates to the fact that data protection has been disguised as privacy within public debate since approval of Brazil's current constitution in 1988. Pressure for a proper bill has increased more recently, when other countries in the region and around the world defined their own legal framework for data protection and began to demand compliance with such regulations for international businesses and commercial transactions. According to Doneda and Mendes (2019: 291-292):
Brazil was largely absent from data protection debates until a relatively recent period, apart from some exceptions such as bills proposed in the 1970s and 1980s that did not thrive. Even if the Federal Constitution of 1988 provides for the right to privacy as well as the Habeas Data action, there has not been a concrete movement in the country to receive trends in data protection, at least until the middle of the 2000s.
Around 2005, when the Brazilian government was asked to respond, in Mercosur forums, to a proposal for a regulation on the protection of personal data, a formal debate began, initially restricted to the Brazilian government, regarding a possible solution legislation on data protection for the country's planning.
It is also important to stress the extremely complex conditions of this country at the time of writing this article. Different levels of government (federal, state, and municipal authorities) are disputing initiatives for managing (or denying) the health crisis. Arguments range from a lack of enthusiasm about or attention to the gravity of the pandemic (principally on the part of the federal authorities) to the difficulties of implementing measures for restricting economic activities and physical gatherings (in some states and municipalities).
It is very hard to assess user desire for personal data protection in contexts such as a global pandemic. And, taking the case of Brazil, any type of simplified binary assumption (in favor either of institutional safeguards of the state or of protection of personal data) becomes inappropriate. In a study about the identification schemes in Brazil, Murakami Wood and Firmino (2009) concluded, for instance, that in many situations the fear of anonymity-meaning disappearance from the eyes of the state and consequent exclusion from welfare and social programs-is greater than concerns about privacy and personal data. Similarly, Arora (2019: 718) argues that "[g]iven the diversity of contexts and cultures, there is a possibility that most of the world may perceive, experience, and value privacy in unpredictable and varied ways, far from the Western constructs that frame privacy in terms of individual choice and data protection."
Besides, as we will argue, by critically assessing the manifold material layers that comprise data-hungry products and applications such as Monitora Covid-19, it becomes clearer where the actual risks are, and what kinds of practices of governance can be put in place to ensure certain levels of privacy and data protection. Such assessment is carried out in this article through a neo-materialist lens.
Method
We analyzed the Monitora Covid-19 application according to a set of strategies inspired by the walkthrough method (Light, Burgess, and Duguay 2018) and the pragmatic and neo-materialist approach being developed at Lab404/FACOM-UFBA (Fox and Alldred 2017; Lemos 2020). While the former can be applied to any application, the latter seeks to investigate the object based on a specific "mode," verifying it according to a debate, in this case the issue of data surveillance and the use and protection of personal data.
The walkthrough method provides a broad description of the object, as well as its vision, operational model, and governance, identifying the purpose of the application, ideal usage scenarios, and potential user base; the way in which it is managed and put into practice; and the model of regulating user actions. The neo-materialist methodology was developed based on four stages (mode, inventory, transduction, and aggregation) for an immanent analysis of the object in debate. Mode investigates the object according to a question of interest; it establishes a good proposition for producing a good discussion and felicity conditions of the analysis (Latour 2013). Inventory describes the network of objects (human and non-human) involved and how they are expressed and entangled (Barad 2007) (interfaces, documents, patents, forms of termination, etc.). Transduction identifies the mediations, meanings, and forces that are produced in the agencies for production of the phenomenon, identifying what things do and what they make happen. Finally, aggregation offers a provisional result about the studied debate. The following stages were, therefore, developed according to those methodologies:
1. Selection of the Monitora Covid-19 application from the thirteen in operation in Brazil, and the inventory of agents that comprise this object-network (institutions, interfaces, companies, documents, users);
2. Analysis of the interface; identification of trackers and permissions and analysis of the official documentation (Privacy Policy/Terms of Use);
3. Analysis of official statements (press releases, videos, and other materials put forth by the institutions involved);
4. Interview with one of the technical-committee members for details about the application's implementation and usage;
5. Analysis of user reactions on Twitter, the Google Play Store, and the Apple App Store about the issue of personal-data protection.
It is important to note that the interviewee was appointed after we formally consulted official representatives of the larger Monitora Covid-19 project. Before and after the interview, the participant formally consented to its recording and subsequent use as research material. We deliberately restricted ourselves to just one interview since this particular individual was instrumental in putting together the Electronic Health Platform (iPeS) and the whole Monitora Covid-19 application.
Furthermore, we chose app stores for conducting user-reaction analysis for a number of reasons. Most importantly, we identified the Google Play Store and Apple App Store as the official and most relevant loci for interaction between users and developers, especially because they constitute a sort of public space for this very specific kind of communication. Since we have no access to private support channels, comments posted to the app's official pages on the Google Play Store and Apple App Store turned out to be the most efficient way of collecting this particular data.
Twitter, in turn, was chosen for its function as a general thermometer for measuring spontaneous user feedback. Even though Twitter might not be the most relevant social network currently operating in Brazil (compared to Facebook, for example), its design makes it more convenient for researchers to collect and analyze public discourse. Analysis of user reactions took place from April 3, 2020 to June 9, 2020 on Twitter; from April 9, 2020 to June 3, 2020 on the Google Play Store; and from May 8, 2020 to June 4, 2020 on the Apple App Store.
Visible, Discursive, and Hidden Materialities
A neo-materialist analysis presumes that the object of interest is constructed according to agencies and relationships. Monitora Covid-19 emerges based on the mediations of the various actors involved- including users, computer code, databases, public institutions, medical teams, and public communication (Gamble, Hanan, and Nail 2019). It exists according to radical mediations (Grusin 2015). Materialities are multiple and diverse-the production of the application as an object drives an extensive chain formed both by the lines of code compiled in Android and by interviews and official declarations of the stakeholders involved. Our interest here is in verifying specifically how the issue of privacy and data protection in the context of the pandemic is produced according to those multiple materialities.
Part of this materiality lies in the genealogy of the application itself, and understanding its history might contribute to positioning some of the actors and institutions involved. As we have seen, Monitora Covid-19 emerged as a response to the disaster caused by the new coronavirus and has taken the form we know today based on an agreement between public and private bodies. Two institutions stand out: Novetech, a start-up specializing in the development of technological products for the public health sector and the developer of Monitora Covid-19; and the technological wing of the Bahia State Family Health Foundation (FESF-tech) responsible for the prospection and development of innovative products and technological public health solutions in the state of Bahia. The agreement established between the two institutions enabled the expansion of Monitora Covid-19, initially to all municipalities in the state based on the provision of back-up medical staff (responsible for remote consultations with users) and the way in which personal data were managed and protected.
According to information obtained from an interview with a member of the FESF-tech technical committee, it was only after the adoption of Monitora Covid-19 by the public body (FESF) that measures and strategies for governance and personal data protection were put in place. This occurred mainly through the entanglement between Monitora Covid-19 and another FESF-tech product, the Electronic Health Platform (iPeS).9 Integration between Monitora Covid-19 and iPeS allows aggregation of pandemic data with data from the citizen's Electronic Health Registry and expands the forms of care offered by the public authorities.
iPeS was designed with a concern for personal-data protection (privacy by architecture or privacy by design). New products being developed and becoming part of iPeS need to comply with legal and technical standards for ensuring protection of personal data. In this sense, Monitora Covid-19's compliance with the LGPD and implementation of an encryption system were demands imposed by the FESF team and not something designed by Novetech beforehand. Safeguards for personal data resulted from the design of the iPeS system, into which the application was incorporated after its initial release.10
It is possible to materially assess the statement above by analyzing Monitora Covid-19's interface. The application interface shows little concern for issues related to privacy and data protection, revealing a gap between the initial interest of the application developers and iPeS governance requirements. Full use of the application is subject to registration based on an extensive input of data, such as full name, CPF social security number,11 SUS healthcare card12 (optional), date of birth, mother's name, sex, email, telephone, and full address.
During the registration process, the user is given no information about the need for provision of these data, or even their connection with the data governance provided by iPeS. Furthermore, the "Terms of Use" documentation only appears in a pop-up window after completion of registration.13 This is the only occasion when the user interacts with the documentation. Another problem is the absence of documentation concerning privacy policies, which is neither available in the application nor in the digital stores. The only way to find out basic data-use information is through the FAQ page.14
These weaknesses in the application interface design point materially to two issues. Firstly, they reveal the fact that the application has not necessarily been developed within data-protection frameworks and was later adapted to meet iPeS demands. Added to this is the fact that iPeS only very recently became widely available-there is not wide public knowledge about how it works. Although the iPeS logo appears on Monitora Covid-19 screens, little is still known about the platform and its data protection mechanisms. As we shall see, material-discursive layers of Monitora Covid-19, especially connected with public communication, make little effort to publicize iPeS and its data-governance system.
The application's key functionality is concentrated in the "How are you feeling now?" button, which prompts the user to provide yes/no answers to a series of questions and update their current health state in the system. The user is asked about comorbidities, regular use of medication, and other information that allow the system protocols to perform a general assessment of their state of health.
The user interface is perhaps the most visible material dimension of Monitora Covid-19, since that is how the user is most directly affected by the object. It therefore draws attention to a lack of communication concerning the mechanisms for safeguarding personal data, and a lack of transparency about care for these data. We might say that problems with the use of personal data stem not so much from requests for excessive information, or from data not directly linked to the functioning of the application, but rather from the lack of important information about data protection for the user throughout all stages of the registration process. The terms-of-use document offers this information only when the user has already completed registration, and the privacy policy document does not exist.
This suggestion is reinforced when we analyze a less visible material dimension of the application: the system of permissions for the application's access to certain user information.15 Google classifies permissions into two groups (normal and dangerous).16 Fourteen permissions requested by the application were identified in AndroidManifest, six of which are classified by Google Play itself as dangerous and eight as normal. Twenty-three sub-notified permissions were found, which do not appear in AndroidManifest but are referenced in the application source code, all of which are classified as dangerous.
The main vulnerability found in the application relates to the sub-notification of permissions from the "messages" group that allows the application to read SMS with sensitive information (passwords, bank information, or verification codes) and to monitor steps and/or movement without informing the user (through "activity recognition"). When requesting this type of information, the application has access to data about users' physical activity and location but can also gauge their movement behavior in the urban space. From a practical viewpoint, this type of service would allow the application to monitor whether users diagnosed with COVID-19 are obeying the recommendations for social isolation, for example, which might help to improve public policies for controlling the spread of the disease and support publicity campaigns to raise public awareness. But this feature is not mentioned in the terms of use or in the official documentation and does not seem to be used to improve the performance or functionality of the application. This inconsistency resulting from the comparison between the app code structure and the content of the documentation on data security and privacy makes it difficult to exercise informed consent,17 as it demonstrates how users do not have enough qualified information about how the initiative deals with personal data.
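The audit logic behind this comparison, flagging as "sub-notified" any permission referenced in the source code but absent from AndroidManifest, reduces to a simple set difference. The sketch below is a hypothetical illustration of that logic; the permission names and sets shown are generic Android examples, not the application's actual lists.

```python
# Hypothetical sketch of a permission audit: permissions referenced in the
# app's source code but not declared in AndroidManifest are "sub-notified".
# The sets below are illustrative, not Monitora Covid-19's real permissions.

# A small sample of Google's "dangerous"-level permissions.
DANGEROUS = {
    "android.permission.READ_SMS",
    "android.permission.ACTIVITY_RECOGNITION",
    "android.permission.ACCESS_FINE_LOCATION",
}


def find_subnotified(declared: set[str], referenced: set[str]) -> set[str]:
    """Permissions used in code without being declared in the manifest."""
    return referenced - declared


# Example inputs an auditor might extract from the manifest and the code.
declared = {
    "android.permission.INTERNET",
    "android.permission.ACCESS_FINE_LOCATION",
}
referenced = declared | {
    "android.permission.READ_SMS",
    "android.permission.ACTIVITY_RECOGNITION",
}

subnotified = find_subnotified(declared, referenced)
dangerous_subnotified = subnotified & DANGEROUS
# In this toy example, every sub-notified permission is in the dangerous group,
# echoing the article's finding that all twenty-three sub-notified permissions
# were classified as dangerous.
```

In practice the two input sets would be extracted with static-analysis tooling (reading `<uses-permission>` entries from the decompiled manifest and scanning the code for permission strings), but the flagging step itself is exactly this comparison.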
The need for swift development may have led to the adoption of ready-made frameworks adapted to the initiative's objectives. This might explain certain inconsistencies between the permissions expressed and those identified in the source code without being mentioned in the AndroidManifest. Such inconsistencies do not necessarily have a malicious purpose but, more importantly, point once again to the gap between the development process and later adaptation to secure data governance. The situation is aggravated by the fact that, as we have mentioned, there are no initiatives for communicating with users about protection, security, and ethical treatment of data. This makes it all the more important to establish a trusting relationship between users and the responsible institutions.
Users are provided with sparse information, and their fear of the pandemic means that they simply trust the state providing the application, supplying their data to an application that can help them in case of illness. Here we have a paradox, or privacy dilemma (Kokolakis 2017; Barth and De Jong 2017; Li et al. 2016), similar to the "cybersecurity dilemma" proposed by Buchanan (2016). It occurs when users say they care about their personal data but provide them anyway once they perceive some benefit in return, revealing a relationship of tension between fear and trust. This tension is Buchanan's (2016) central argument in pointing out the "cybersecurity dilemma": fear and trust arise between states due to invasions of electronic systems, since states must protect themselves while attacking the networks of other states, all while trying to maintain relationships of trust. As Buchanan (2016: 3) states:
to assure their own cybersecurity, states will sometimes intrude into the strategically important networks of other states and will threaten-often unintentionally-the security of those other states, risking escalation and undermining stability. This concept is hereafter referred to as the cybersecurity dilemma.... The security dilemma is the long-standing notion that states inevitably, though not deliberately, induce fear in other states as they secure themselves. As a result, these other states are likely to respond, seeking to reaffirm their own security but in the process unintentionally threatening others and risking still further escalation. The core tenet has proven robust and applicable to a wide range of circumstances throughout history.
Production of this trusting relationship leads us to another material dimension of the object, now more visible: the official discourse. As Monitora Covid-19 expands to more states and towns in the northeast of the country, public communication campaigns increase, aimed at mobilizing populations to adopt the application. When users come into contact with this material (official documents, press releases, advertising material, and so on), it tends to shape their trajectory and plays a role in the construction of Monitora Covid-19, which is consolidated through use of the application.
Terms of use13 and FAQs14 are important documents in the construction of this material and are the only legal documents accessible to users. Aspects related to data protection appear in several sections of the terms of use, which emphasize that data are stored securely, encrypted, and anonymized on the iPeS platform, thus conforming to the LGPD. Nonetheless, the document does not specify how this occurs in practice.
Integration of data with iPeS, allowing insertion of these data into the citizen's electronic registry, seems to be fully justified in light of the character of the application. The main thing missing is a document about privacy policies.
One important condition singled out by the legal documents is users' commitment to surrender data. The document states that when sending any information through Monitora Covid-19 (framed as a "contribution" by the legal documentation), the user grants a perpetual, royalty-free, and unconditional license to the institutions cooperating in the initiative. This attracts attention in light of the general discourse of analysts about the life cycle of pandemic data, which seems unanimous in arguing that all data used on this occasion should be discarded as soon as the pandemic is under control (Bigo 2020; Morley et al. 2020). Although the iPeS platform can provide security and secure public governance of these data, consideration needs to be given to the purpose and proportionality of maintaining pandemic databases in the future.
When we look at public communication products, such as press releases, institutional videos, and advertising material, privacy concerns are less evident. Reassuring the population about the use of their data does not seem to be a priority for communication departments, as if people were not concerned with this issue. Unless users seek out the terms of use or FAQs, they will find it hard to know whether and how their personal data are secure.
Much of the institutional material is concerned with positioning Monitora Covid-19 as an important public initiative in combatting the new coronavirus, establishing use of the application as a civic duty in the face of a serious health crisis. It seeks to mobilize users' feelings of trust and community in confronting the pandemic. The only official press release that directly touches on the issue was produced by the Northeast Consortium Scientific Committee18 stating that:
The user can rest assured that personal data will be protected. The application was structured with all technological guarantees of privacy and security of the information provided, with the seal of scientists on the Scientific Committee.... Although the questionnaire is completed with personal information, these data remain anonymous. CPF information only needs to be provided in the case of remote consultation, as it is essential for the doctor to access the patient's SUS registration to check for comorbidities, medications and other health information in the patient's medical records. (Comitê Científico do Consórcio Nordeste 2021)
In terms of public discourse, therefore, Monitora Covid-19 appears as a public tool for supporting the population in times of crisis, based principally on remote medical care. Discussion of data protection is not only absent from the official communication of institutions but also strikingly missing from user feedback. Analysis of user comments on Twitter and in the official application stores19 (the Google Play Store and Apple App Store) shows spontaneous criticism concerning privacy to be in the minority, amounting to approximately 6% of comments, relating mainly to the amount of information required on registration, with some concern about the application's lack of a privacy policy.
The material-discursive layer therefore links Monitora Covid-19 both to some degree of legal security, since the official documentation (terms of use and FAQ) certifies correct use of personal data, and to civil security, since the public communication products seek to foster feelings of trust and citizen duty through advertising. Analysis of user impressions in both application stores and on Twitter indicates that, unless prompted, privacy does not appear on the list of public concerns.20 Based on our analytical efforts, it can be stated that in the official discourse about Monitora Covid-19, privacy is grounded both on information about legal compliance and concern with the security of personal data and on the trusting relationship established between institutions and citizens.
Monitora Covid-19 as Object-Network
The evaluation of the visible, discursive, and hidden materialities of the application performed so far indicates the major privacy and personal-data problems related to the interface, the official documents and discourse, the permissions and trackers, and user comments in application stores and on Twitter. Here we reveal the various actors that make up the "object-network" and how each one acts in the material construction and operation of the application in relation to the issue of personal-data protection. The materialities described previously begin to act relationally, and the role of each actor in the network becomes more evident in the composition of the "surveillance intelligence" of the Monitora Covid-19 solution.
Monitora Covid-19 stands as an object-network that draws together a broad network of action constructed by public authorities to provide the user with direct medical attendance and supply data for the construction of public programs for containing the pandemic. We should highlight that the application is therefore not an autonomous object in this "pastoral" protection solution, since it is presented as the most visible materiality of a network integrating four elements of greater prominence: iPeS and FESF-tech, Novetech, SUS, and the Northeast Consortium. Figure 1 presents a suggested composition of this network.
The data flow (Figure 2) shows integration of this network, with the application as the final point of contact with the user. Data provided by the user are recorded in the application, sent to Novetech, which forwards them to iPeS, generating actions on four fronts (medical team, healthcare authorities, the Northeast Consortium, and the Electronic Citizen Health Registry). Integration of Monitora Covid-19 with iPeS makes it possible to aggregate pandemic data with the Electronic Citizen Health Registry, expanding the possible forms of healthcare provided by the public authorities.
Data management by iPeS provides public governance and a firewall against predatory and/or malicious practices from companies offering data-based digital services. iPeS guarantees that the public agency will have control and lasting access to citizens' health data, imposing restrictions on public and private companies developing applications related to the protection of personal data.
There are, therefore, at least two points of greater vulnerability. On the one hand, the private company that created the application, Novetech, has access to and management of some of the system data. Although it cannot legally extract value from the data collected by Monitora Covid-19 (selling them to third parties, for example), technical vulnerabilities in the system could compromise information security. On the other hand, by trying to construct a personal-data protection regime with the fewest possibilities for private management and extraction, the system seems to rely overly on the public agency as a guarantor of security and privacy. Public agents do not always act to ensure the wellbeing of the population: Edward Snowden's 2013 revelations about how the US National Security Agency monitored US citizens without prior suspicion leave no doubt about that. A system with "public control" (such as regulatory agencies involving the participation of representatives from different sectors of civil society) would be better protected from attacks on network databases.
The new coronavirus pandemic has triggered several actions, such as the anticipation of iPeS implementation and the development of Monitora Covid-19. iPeS operates as a central mediator between the different data sources (other systems, applications, health records, etc.) and the state, functioning as a large data hub that consolidates the Electronic Health Registry in a secure manner. Information provided by a Northeast Consortium technology consultant reveals that iPeS was designed with concerns related to personal-data protection and respect for the LGPD. Monitora Covid-19's concern for privacy consequently results from the iPeS system and the demand for adaptation to the LGPD, as well as the implementation of an encryption system by the FESF team. Current management by the Northeast Consortium (including the state governments that are part of its scientific committee) and FESF can thus be said, at least circumstantially, to have generated a data-protection regime, particularly for health data, supported by the specific arrangement of iPeS/FESF/SUS.
This arrangement is able to create filters and sensitivity stamps for different types of data packages and thus determine different levels of security and data use for the various bodies connected to the system. Attention is drawn here to the lack of publicity given to this robust material combination for safeguarding and ethical use of data. Unlike the predatory conditions characteristic of surveillance capitalism, the case of Monitora Covid-19 demonstrates the importance-in a pedagogical sense as well-of opening the black boxes and demonstrating how mass use of personal data can take place based on responsible planning. This is the opposite of the paradox of transparency and opacity discussed by Pasquale (2015), for example.
Part of the action of Monitora Covid-19 is related to a big data operation that aims to help public administrators make decisions about combatting the pandemic. Establishment of the committee's "Situation Room" enables data coming from Monitora Covid-19 to become part of the scientific bulletins produced by state authorities as a way of proposing specific combat strategies based on algorithmic analysis. Data are analyzed algorithmically and medical care teams are alerted to make direct contact with the user. The user now figures not only as a suspected case needing state support but also as someone who has submitted their data to contribute directly towards the establishment of policies for combating the new coronavirus. This occurs in two ways: (1) providing relief to the public health system through the use of remote consultation and (2) through intelligence from the data collected by the registries. The success of the Monitora Covid-19 model consequently depends on adoption of the application.21
Monitora Covid-19 is, therefore, the product of a broad material network. Isolated analysis of the application does not allow an understanding of the transduction occurring between the most diverse actors in this network. It needs users willing to supply their data. These are processed by Novetech and directed by iPeS, which connects the data from that user to the Electronic Health Registry (hence the need for SUS data on registration, although that is not explained to the user). The Northeast Consortium consolidates the data into bulletins, reports, and charts, and informs governors and public administrators of the best strategies for confronting the pandemic. This is materialized into actions in the cities that provide a structure for a backup clinical team responsible for remote consultations of suspected cases registered in the application.22
The system functions from the outset based on the first input of user data. When the user is categorized as high or medium risk by the Monitora Covid-19 algorithm, the back-up medical team makes contact via telephone or the application chat service. That is the only way in which the user can activate the system.23 Actions are thus triggered only once the user is classified by the algorithm: the user is directed to await contact, and communication is then established with healthcare professionals. The pastoral cycle of state care (Northeast Consortium), indicated in section one, is completed from this stage, either with the conclusion of the consultation (in cases where suspected contagion is rejected) or with more in-depth monitoring based on tracking the movements of the person suspected of being infected.
Suspension and abandonment of the application occur as users stop reporting their symptoms. Termination of the relationship occurs simply through decline in use and eventually when the user decides to uninstall Monitora Covid-19. The interface has no data opt-out options. The user can request deletion of registration (via email) but not of data. This information does not appear in the terms of use or FAQs. There is no clear planning of the data life cycle.
Based on joint initiatives by the professionals involved and the public institutions of the network responsible for constructing the data-protection scheme represented by iPeS, the application functions according to acceptable parameters for personal-data protection, with regard to the national legislation about to come into force, as a device for the implementation of public policies and care for the population. But there is no federal body to ensure that there is no abuse in the official request for and governmental use of mass personal data.24 There are certainly problems with the application:
1. Lack of a privacy policy document;
2. Lack of explanation of terms of use before registration;
3. Vulnerability of data passing through the (private) company that developed the application;
4. Impossibility of opting out of data submission (you can only request exclusion from the register);
5. Vulnerability of SMS messages: analysis of the trackers and permissions revealed a vulnerability that allows the application to read SMS messages (which can contain sensitive information such as passwords, bank details, or verification codes) and the possibility of monitoring user movement; neither is mentioned in the official documentation.
For the moment, and circumstantially, the network of guarantees created through the combination of FESF/iPeS/SUS and the Northeast Consortium as project administrator indicates by-design attention to aspects of personal-data protection in the establishment of Monitora Covid-19. But the primary question raised here concerns the fragile national structure for the protection and privacy of personal data (Zanatta and Bioni 2020) and the consequent lack of regulatory bodies for smart-surveillance initiatives (in this case, epidemiological) that can supervise the application of (public and private) technologies with ethical guarantees and provide public control of the applications. As Zanatta and Bioni (2020: 1) have stated, "data protection is part of the Covid-19 vaccine."
Conclusion
The pandemic has generated justifications used by various authorities and technology companies for extensive monitoring of populations around the world, with the aim of containing infection and avoiding deaths and suspension of economic activities. In the case of Brazil, one of the epicenters of the pandemic, there are no coordinated initiatives on the use of smart surveillance from the main federal authorities. The most complete use of some type of technology for control of the contagion and care for those infected is the Monitora Covid-19 application implemented by states in the northeast of the country.
At the beginning of the pandemic, French and Monahan (2020) tried to anticipate some effects of the fight against COVID-19 on the field of surveillance studies. Unfortunately, in most cases their predictions proved right, indicating a scenario of increased monitoring targeted at more vulnerable populations, a proliferation of misinformation, and widespread uncritical acceptance of surveillance technologies under the justification of improving control of contamination and healthcare needs. In this paper, we have added examples from Brazil to this account, from the erratic reactions of the government and its role in producing disinformation to the celebrated use of applications and mobile technologies to track the advance of the disease.
However, we have also revealed a more nuanced complexity in the socio-technical structure of the implementation of one of these applications in the Northeast of the country, where some levels of legal and technological safeguards were deliberately designed. We have done so by looking at the discursive and hidden materialities of Monitora Covid-19. As we have seen, this application was developed with a concern for the protection of personal data and for data security, but this was not the result of federal public control through institutions established for such purposes and instead was due to the awareness of its developers and the state institutions specifically involved in this case (which are not experts in matters of digital law).
We argue that leaving such initiatives to be controlled by the good sense of their developers is a risk inherent in the smart surveillance spreading across the world. Many different decisions need to be made during the course of development and, although they may seem trivial, some of them must be brought into public debate concerning social wellbeing. The fact that iPeS is hosted by Amazon Web Services (AWS), for example, may seem ordinary, but it might motivate discussions about the need for keeping sensitive personal data on servers located on home territory. It would be desirable for technical certification to be in the hands of a state or federal authority concerned with digital information and communication technologies, which would oversee and approve the use of applications for mass and official use in relation to this issue.
However, there is also a point to be made regarding the right to erasure and the lifecycle of pandemic data. In the current scenario of possible justifiable reasons for applying surveillance techniques for containing loss of life, discussions concerning the "right to be forgotten" and "the right to erasure" add an essential layer of attention to the life span of pandemic data. Following Azevedo Cunha and Itagiba (2016), we differentiate the right to be forgotten from the right of erasure-the former concerning the individual's right to privacy against the public's right to information and the press's freedom of expression. In Brazil, for example, the Supreme Court (the highest judicial appellate court for constitutional issues) ruled in February 2021 that, in general, the right to be forgotten cannot displace freedom of expression and freedom of information as fundamental rights-the public cannot be denied its collective memory, as one of the judges declared.
The right to erasure, however, concerns inaccuracy, incompleteness, or objection to the processing of personal data,25 which more directly concerns the Monitora Covid-19 app and the LGPD. There are three main arguments to be made here regarding the right to erasure and the lifecycle of pandemic data. First of all, the lack of a strong operating regulatory body, such as the ANPD, could represent a real danger of misuse and malicious appropriation of said data and of said type of legislation (cf. Arora 2019). Even though, as previously stated, Brazil has come a long way with the implementation of the LGPD, there is still a need for a robust data protection and ethics culture within our larger regulatory, legislative, and judiciary bodies; this is paramount for citizens to be able to assert their rights (be it to erase, complete, or correct) over their own personal data. Second, we found no evidence whatsoever of mass data-surveillance practices being adopted by Brazilian public authorities in the context of COVID-19, and there seems to be a lack of an irreversible data-monitoring ethos.
This is not an attempt to play down the importance of keeping a watchful and critical eye on how pandemic data are being handled by the government (especially given the record-breaking performance in violations by the current far-right government). But it should be noted that data-monitoring techniques have barely been deployed at all, which indicates a lack of awareness among local authorities of the potentials of datafication for public health. And third, this is problematic, since a more extensive adoption of datafied systems, such as iPeS and Monitora Covid-19, could be heavily beneficial for public health authorities, especially when it comes to the strategic planning of countermeasures and tactical responses to the pandemic. It is not just a question of enacting individual liberty (such as the right to erasure or the right to be forgotten) over these datafied systems, as the cases discussed by Azevedo Cunha and Itagiba (2016) illustrate, but a commitment to the public good within an ethical data framework.
We consider this detailed and neo-materialist analysis of the Monitora Covid-19 application as an object-network, which looked at some of its veiled techno-political constituents, to be part of an important conversation with other scholarly inputs to surveillance studies that are trying to understand the overlapping features of surveillance of disease and surveillance of populations (French and Monahan 2020; Newell 2021; Marciano 2021; Liu 2021; Monahan 2021). We argue that our analysis adds to the efforts to build "a broader understanding of surveillance than what is typically given in public health discourse" (French and Monahan 2020: 4), as a contribution to a critical take on the "surveillance of the dis-ease" (French and Monahan 2020).
Periods of crisis (war, environmental disaster, pandemics, etc.) might even justify actions of mass digital biopolitics, but these systems need to be overseen and framed into strict technical timeframes and robust legal and social controls. It is essential to be alert to the risks of normalizing the use of personal data in exceptional conditions and the irreversible expansion of intelligent mass surveillance. The possibility of controls with ethical criteria for justifying the employment of these technologies, as proposed by Morley et al. (2020), therefore, seems fundamental in all phases of these types of projects, from consideration of the need for development, through design, implementation, and subsequent supervision. Floridi (2020: 4) argues, "if it turns out that one cannot build it rightly, perhaps one should not build it at all in the first place."
1 With an object-centered analysis, this article does not intend to present consumer views. The analysis is immanent to the object, aiming to identify how data surveillance and privacy issues are produced by the device independent from user inspection. Consumer viewpoints are important-and we recognize that further research is needed-but that is not the aim of this paper.
2 It is interesting, and tragic, to see the timeline of deaths in Brazil compared to the life cycle of this very article. Its first draft upon submission to the journal announced the mesmerizing number of 100,000 deaths and 2.7 million cases. Three months later, the first revised draft was updated with 200,000 deaths and 8.5 million cases. And ten months later, official reports show the unbearable numbers presented here, and counting.
3 After this article was written, the federal government implemented a contact-tracing function in the CoronavirusSUS application (https://www.gov.br/pt-br/apps/coronavirus-sus). There are still no indicators pointing to mass adoption or the effectiveness of this feature. Although there has been a high number of downloads, it has not been integrated into a single action strategy for combatting the pandemic. Many technical problems have been reported by users since implementation of this feature. For these reasons, the most important application in use in the country is still Monitora Covid-19.
4 See http://www.novetech.com.br/.
5 According to information obtained from an interview with a member of the technical team, an investment of approximately 25,000 USD is estimated to be required for the operation of Monitora Covid-19 and its supporting systems by the end of 2020. It should also be emphasized that the application has been provided for use at no cost to public bodies and, according to the interview, much of the effort put into implementation of the initiative is voluntary.
6 See https://www.privacyinternational.org/examples/apps-and-covid-19.
7 See https://www.hrw.org/news/20.
8 The General Data Protection Law (LGPD) is the Brazilian equivalent of the European GDPR, which inspired it.
9 iPeS is developed and managed by FESF-tech as a platform for public governance of healthcare data associated with the Single Healthcare System (SUS). Its function is to consolidate citizen health data from multiple sources (public and private) securely while ensuring data protection. Implementation of iPeS was brought forward thanks to the public disaster conditions caused by the new coronavirus pandemic.
10 It should be noted that even though the Monitora Covid-19 app is now compliant with iPeS's data protection requirements, we cannot firmly state that the app is not connected with other public and/or private data sources and/or data brokers. It is possible, but not likely, that the original Monitora Covid-19 developer (a startup under the name Novetech) managed to connect the app's database with third-parties or business partners (public or private). However, that could be considered a violation of iPeS terms of conduct.
11 CPF (Cadastro de Pessoa Física, Individual Persons Registry) is the main identification number for Brazilians.
12 SUS (Sistema Único de Saúde, Single Healthcare System) is the Brazilian public health system and is considered one of the largest in the world. SUS serves more than 70% of a population of 209 million inhabitants.
13 Available at: <https://www.dropbox.com/s/9ppccfzwmz1bvld/termosdeusoMonitoraCovid19.pdf?dl=0>.
14 See https://monitoracovid19.com.br/faq.
15 The permissions system is designed to give the user greater control over the information that can be accessed by applications, operating as a system that checks which data are tracked by the application and which permissions are used on the user's device. Our analysis was based on the APKPerm script (https://github.com/INCTDD/APKPerm), a tool developed in Python that allows automated reading of the internal structure of applications.
16 Normal permissions are considered to be lower risk and allow the applicant access to isolated resources for the application's operation, with minimal risk to other applications, the system, or the user. This type of permission is granted automatically to the requesting application on installation, without requiring prior and explicit user approval. Dangerous ones pose a potential risk to user security by providing access to personal data or control over the device. In this case, the system requests user approval for granting permission.
17 In short, informed consent refers to the autonomous right of individuals to make decisions based on as much information as possible about the implications of their choices. Even though informed consent can usually be regarded as good practice in terms of data governance, it should be noted that it is not enough to guarantee ethical treatment of data. As Bioni (2020) states, there are several limits for informed consent when it comes to data protection, especially in a pandemic context.
18 See https://www.comitecientifico-ne.com.br/imprensa.
19 Looking at the comments on the application stores and Twitter, most feedback is found on the Google Play Store, with 482 reviews to date. No tweets criticize or even mention the issue of privacy or the use of personal data. We therefore concentrate on the Google Play Store.
20 As we explained earlier, with the neo-materialist approach used here, we seek to identify which privacy problems are addressed by the networks (devices, documents, interfaces, comments) in which the Monitora Covid-19 application is a central object. For example, even if a poll indicated that users are not concerned with privacy issues (as we saw in the comments), we could still say that the device reveals problems regarding this issue. Our goal is to point out the problems emerging from the device, rather than users' perceptions of the problems. In the network approach, the issue of privacy is not just a matter of perception but of the emergence of a wide network. Future research should include a survey of users to complete this picture.
21 As this is a public system, there is no strategy designed for creating profit. The application is distributed free of charge. There is no evidence that data or intelligence produced by the application are commercialized in any way.
22 The operating model is specific to each locality, as is the training of teams. FESF carries out the training and establishes attendance protocols, which are standardized among different teams, including with regard to the ethical treatment of data. The structure and formation of these teams, however, is the responsibility of the local health authority, which makes individualized decisions according to the conditions of each city and state in the Northeast Consortium. It is possible for certain municipalities to have access to Monitora Covid-19 data in their city without necessarily providing a back-up medical team. In such cases, they have access only to geo-referencing of the cases. Sensitive personal data (comorbidities, symptoms) and demographics are not available.
23 Unless false information is entered at registration, declaring more serious symptoms and forcing faster consultation. However, there is no evidence of this practice occurring.
24 This function would be partially performed by the National Data Protection Authority (ANPD), established by the LGPD as a separate regulatory agency. With the postponement of the LGPD, there is still no prospect of implementation of the ANPD.
25 As stated by Azevedo Cunha and Itagiba (2016), it is important to point out that the General Data Protection Regulation (GDPR) provides the right to erasure and the right to be forgotten as a single right: the right to erasure constitutes the conditions for enactment of the right to be forgotten.
References
Arora, Payal. 2019. General Data Protection Regulation-A Global Standard? Privacy Futures, Digital Activism, and Surveillance Cultures in the Global South. Surveillance & Society 17 (5): 717-725.
Azevedo Cunha, Mario, and Gabriel Itagiba. 2016. Between Privacy, Freedom of Information and Freedom of Expression: Is There a Right to Be Forgotten in Brazil? Computer Law & Security Review 32 (4): 634-641.
Barad, Karen. 2007. Meeting the Universe Halfway: Quantum Physics and the Entanglement of Matter and Meaning. Durham, NC: Duke University Press.
Barth, Susanne, and Menno D.T. De Jong. 2017. The Privacy Paradox: Investigating Discrepancies between Expressed Privacy Concerns and Actual Online Behavior: A Systematic Literature Review. Telematics and Informatics 34 (7): 1038-1058.
Bigo, Didier. 2020. Covid-19 Tracking Apps, or: How to Deal with a Pandemic Most Unsuccessfully. About Intel: European Voices on Surveillance, June 3. https://aboutintel.eu/covid-digital-tracking [accessed August 20, 2020].
Bioni, Bruno Ricardo. 2020. Proteção de dados pessoais: a função e os limites do consentimento. 2nd ed. Rio de Janeiro, BR: Editora Forense.
Buchanan, Ben. 2016. The Cybersecurity Dilemma: Hacking, Trust, and Fear Between Nations. Oxford, UK: Oxford University Press.
Bunz, Mercedes. 2020. Contact Tracing Apps: Should We Embrace Surveillance? Mercedes Bunz, April 29. https://mercedesbunz.net/2020/04/29/630 [accessed August 20, 2020].
Comitê Científico do Consórcio Nordeste. 2021. Aplicativo Monitora Covid-19. Projeto Mandacaru, Release 5, July 21.
Doneda, Danilo. 2019. Da Privacidade à Proteção de Dados Pessoais: Elementos da Formação da Lei Geral de Proteção de Dados. 2nd ed. São Paulo, BR: Thomson Reuters Brasil.
Doneda, Danilo, and Laura Schertel Mendes. 2019. A Profile of the New Brazilian General Data Protection Law. In Internet Governance and Regulations in Latin America, edited by Luca Belli and Olga Cavalli, 291-305. Rio de Janeiro, BR: FGV Direito Rio.
Firmino, Rodrigo, and Rafael Evangelista. 2020. Modes of Pandemic Existence: Territory, Inequality, and Technology. In Global Data Justice and COVID-19, edited by Linnet Taylor, Gargi Sharma, Aaron Martin, and Shazade Jameson, 100-108. London: Meatspace Press.
Floridi, Luciano. 2020. Mind the App: Considerations on the Ethical Risks of COVID-19 Apps. Onlife, April 18. https://thephilosophyofinformation.blogspot.com/2020/04/mind-app-considerations-on-ethical.html [accessed August 20, 2020].
Foucault, Michel. 2007. Security, Territory, Population: Lectures at the Collège de France 1977-1978. Basingstoke, UK: Palgrave.
Fox, Nick, and Pam Alldred. 2017. Sociology and the New Materialism: Theory, Research, Action. London: SAGE Publications.
French, Martin, and Torin Monahan. 2020. Dis-Ease Surveillance: How Might Surveillance Studies Address COVID-19? Surveillance & Society 18 (1): 1-11.
Gamble, Christopher N., Joshua S. Hanan, and Thomas Nail. 2019. What is New Materialism? Angelaki 24 (6): 111-134.
Grusin, Richard. 2015. Radical Mediation. Critical Inquiry 42 (1): 124-148.
Kitchin, Rob. 2020. Will CovidTracker Ireland Work? The Programmable City, April 29. http://progcity.maynoothuniversity.ie/2020/04/will-covidtracker-ireland-work [accessed August 20, 2020].
Klein, Naomi. 2014. The Shock Doctrine: The Rise of Disaster Capitalism. London: Penguin Books Limited.
Kokolakis, Spyros. 2017. Privacy Attitudes and Privacy Behavior: A Review of Current Research on the Privacy Paradox Phenomenon. Computers & Security 64: 122-134.
Latour, Bruno. 2013. An Inquiry into Modes of Existence. Cambridge, MA: Harvard University Press.
Lemos, André. 2020. Epistemologia da Comunicação, Neomaterialismo e Cultura Digital. Galáxia (São Paulo) (43): 54-66.
Lemos, André, and Daniel Marques. 2020. Vigilância Guiada por Dados, Privacidade e Covid-19. Dossiê In Vitro, Lab404/UFBA, March 14. http://www.lab404.ufba.br/vigilancia-guiada-por-dados-privacidade-e-covid-19/ [accessed August 20, 2020].
Li, Han, Xin Luo, Jie Zhang, and Heng Xu. 2016. Resolving the Privacy Paradox: Toward a Cognitive Appraisal and Emotion Approach to Online Privacy Behaviors. Information & Management 54 (8): 1012-1022.
Light, Ben, Jean Burgess, and Stefanie Duguay. 2018. The Walkthrough Method: An Approach to the Study of Apps. New Media and Society 20 (3): 881-900.
Liu, Chuncheng. 2021. Chinese Public's Support for Covid-19 Surveillance in Relation to the West. Surveillance & Society 19 (1): 89-93.
Marciano, Avi. 2021. Israel's Mass Surveillance During Covid-19: A Missed Opportunity. Surveillance & Society 19 (1): 85-88.
Medrano, Maria. 2015. Brazil's Internet Bill of Rights. Americas Quarterly 9 (2): 99-102.
Monahan, Torin. 2021. Reckoning with COVID, Racial Violence, and the Perilous Pursuit of Transparency. Surveillance & Society 19 (1): 1-10.
Morley, Jessica, Josh Cowls, Mariarosaria Taddeo, and Luciano Floridi. 2020. Ethical Guidelines for COVID-19 Tracing Apps. Nature 582 (7810): 29-31.
Murakami Wood, David, and Rodrigo Firmino. 2009. Empowerment or Repression? Opening up Questions of Identification and Surveillance in Brazil through a Case of "Identity Fraud." IDIS 2: 297-317.
Newell, Bryce Clayton. 2021. Introduction: Surveillance and the COVID-19 Pandemic: Views from Around the World. Surveillance & Society 19 (1): 81-84.
Pasquale, Frank. 2015. The Black Box Society. Cambridge, MA: Harvard University Press.
Rodotà, Stefano. 2008. A Vida na Sociedade da Vigilância: A Privacidade Hoje. Rio de Janeiro, BR: Renovar.
Souza, Carlos Affonso, Fabro Steibel, and Ronaldo Lemos. 2017. Notes on the Creation and Impacts of Brazil's Internet Bill of Rights. The Theory and Practice of Legislation 5 (1): 73-94.
Srnicek, Nick. 2017. Platform Capitalism. New York: John Wiley & Sons.
Trottier, Daniel, Qian Huang, and Rashid Gabdulhakov. 2021. Covidiots as Global Acceleration of Local Surveillance Practices. Surveillance & Society 19 (1): 109-113.
Warren, Samuel D., and Louis D. Brandeis. 1890. The Right to Privacy. Harvard Law Review 4 (5): 193-220.
West, Sarah Myers. 2019. Data Capitalism: Redefining the Logics of Surveillance and Privacy. Business and Society 58 (1): 20-41.
Westin, Alan. 1967. Privacy and Freedom. New York: Ig Publishing.
Woodhams, Samuel. 2020. COVID-19 Digital Rights Tracker. ToplOvpn, July 3. https://www.topl0vpn.com/research/investigations/covid-19-digital-rights-tracker/ [accessed August 20, 2020].
Zanatta, Rafael, and Bruno Bioni. 2020. Proteção de dados faz parte da vacina contra Covid-19. Jota, May. https://www.jota.info/opiniao-e-analise/artigos/protecao-de-dados-faz-parte-da-vacina-contra-covid-19-04052020 [accessed August 20, 2020].
Zuboff, Shoshana. 2019. The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. New York: PublicAffairs.
© 2022. This work is published under https://creativecommons.org/licenses/by-nc-nd/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
Details
1 Federal University of Bahia, Brazil
2 Pontifícia Universidade Católica do Paraná, Brazil
3 Fundação Getúlio Vargas, Brazil