ABSTRACT This contribution addresses the risks and opportunities for cultural diversity resulting from platforms' personalisation tools, based on a legal analysis of the main provisions of the European Union General Data Protection Regulation (GDPR) on consumer profiling and automated decisions, as well as of a sample of data protection policies of selected streaming platforms. It examines how these policies may in practice affect the protection of consumers' personal data processed for the purpose of recommending personalised audio-visual and music content online, and how such provisions relate to the discoverability of a diversified cultural offer online and, at the European level, to the obligation for platforms to give prominence to European works in their catalogues. The paper shows that much may still be done to improve the transparency of the algorithms used for personalisation purposes and to provide users with greater control over their data, as required by the GDPR.
KEY WORDS
CULTURAL DIVERSITY, GDPR, PROFILING, AUTOMATED DECISIONS, PERSONALISATION, DISCOVERABILITY
SAŽETAK Članak se bavi pitanjima rizika i prilika za kulturnu raznolikost koji proizlaze iz alata za personalizaciju platformi, a temelji se na pravnoj analizi glavnih odredbi Opće uredbe o zaštiti podataka Europske unije (engl. GDPR) o profiliranju potrošača i automatiziranim odlukama, kao i na analizi politika zaštite podataka odabranih streaming platformi. U članku se ispituje kako potonji mogu u praksi utjecati na zaštitu osobnih podataka potrošača u svrhu preporučivanja personaliziranih audiovizualnih i glazbenih sadržaja na internetu te kako se takve odredbe odnose na otkrivanje raznolike kulturne ponude na internetu i, na europskoj razini, na obvezu platformi da istaknu europska djela u svojim katalozima. Zaključuje se da još mnogo toga mora biti učinjeno kako bi se poboljšala transparentnost algoritama koji se koriste u svrhu personalizacije te kako bi se korisnicima omogućila veća kontrola njihovih podataka, kako to zahtijeva GDPR.
KLJUČNE RIJEČI
KULTURNA RAZNOLIKOST, OPĆA UREDBA O ZAŠTITI PODATAKA (GDPR), PROFILIRANJE, AUTOMATIZIRANO ODLUČIVANJE, PERSONALIZACIJA
INTRODUCTION
Audio-visual and music streaming platforms process users' data in order to determine the personalised content recommendations that are considered to reflect users' preferences and that are put forward to each user upon connection to his or her account. Such platforms personalise each user's homepage as part of their services through the exclusive application of recommendation algorithms and, therefore, without human intervention. While personalisation responds to the need to filter an increasing, and sometimes overwhelming, volume of information and content, the lack of human intervention makes the personalisation process opaque and not easily understandable by most users.
In addition, such personalisation decisions made by platforms may have significant effects on users from a cultural diversity standpoint (Richieri Hanania & Norodom, 2016), "cultural diversity" being understood as "the manifold ways in which the cultures of groups and societies find expression", in accordance with Article 4.1 of the UNESCO Convention on the Protection and Promotion of the Diversity of Cultural Expressions. Usually based on previously expressed content preferences, personalisation algorithms tend to lock users into their earlier choices, sometimes perpetuating stereotypes and the polarisation of views (see, for instance, Burri, 2016), and preventing users from discovering new and culturally diverse content that reflects the cultural richness of our planet.
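To make this lock-in mechanism concrete, the following toy sketch (our own illustration with invented titles and tags, not any platform's actual algorithm) shows how a recommender that scores candidates purely by their overlap with past viewing systematically buries content that differs from a user's earlier choices:

```python
from collections import Counter

# Hypothetical five-title catalogue tagged by genre and cultural origin.
catalog = {
    "Title A": {"comedy", "french"},
    "Title B": {"comedy", "us"},
    "Title C": {"comedy", "british"},
    "Title D": {"documentary", "korean"},
    "Title E": {"drama", "nigerian"},
}

def recommend(history: list[str], k: int = 2) -> list[str]:
    # The user's profile is simply the frequency of tags already watched.
    profile = Counter(tag for title in history for tag in catalog[title])
    # Unseen titles are scored by their overlap with that profile...
    scores = {title: sum(profile[tag] for tag in tags)
              for title, tags in catalog.items() if title not in history}
    # ...so anything unlike past viewing (here D and E) scores zero and
    # is effectively never surfaced: the lock-in effect described above.
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(recommend(["Title A", "Title B"]))  # ['Title C', 'Title D']
```

After two comedies, a third comedy dominates the ranking, while the documentary and the drama from other cultural origins score zero and only appear because the toy catalogue has run out of similar titles.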
This paper aims to analyse, from a legal perspective, four global streaming platforms' automated personalised recommendation systems and their data privacy policies in light of Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data and repealing Directive 95/46/EC ("General Data Protection Regulation", hereinafter "GDPR"). In line with Article 288 of the Treaty on the Functioning of the European Union, the GDPR, as a European Union (EU) Regulation, is directly applicable in all EU Member States. It applies to the processing of personal data of data subjects in the EU, even if the processing does not take place in the EU (Article 3 of the GDPR). The streaming platforms analysed in this paper were selected due to their global reach and relevance in the EU market. Since they process personal data of users established in the EU, they are subject to the GDPR and have adapted their privacy policies to comply with this Regulation.
This paper inquires into the way personalised recommendations by such platforms may affect the protection of users' personal data, as well as users' online content consumption and their "discoverability" of diverse online cultural content, understood as the ease with which users come across new and diverse content amidst the tremendous amount of content available online. The discoverability of diverse cultural content online implies not only that diverse content is made available (diversity in supply), but also that users are able to effortlessly access such content (diversity in consumption) (see, for instance, Burri, 2016; Ochai, 2022, p. 115; Richieri Hanania & Norodom, 2016).
From a legal assessment viewpoint, the GDPR provides a protective framework for the processing of personal data that allows for the creation of profiles intended to better understand the personality, habits, and consumer preferences or, more generally, the behaviour of consumers. Article 4.2 of the GDPR broadly defines the "processing" of personal data as
any operation or set of operations which is performed upon personal data or sets of personal data, whether or not by automatic means, such as collection, recording, organisation, structuring, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, restriction, erasure or destruction.
Profiling is specifically defined in Article 4.4 of the GDPR as referring to:
any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person's performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements.
Through profiling, an individualised profile is constructed on the basis of the personal data collected on a person. This may be done by a fully automated decision, or through a partially automated decision, when the latter is also accompanied by human intervention. Since the evaluation of certain personal data is part of the very definition of profiling, it requires that a certain judgement be applied to a person, i.e., that the data collected are used to draw conclusions about that person, whether or not a decision is subsequently made about him or her.
Useful clarification on the legal regime applicable to profiling may be found in the guidance on profiling and automated decision-making by the "Article 29 Data Protection Working Party" established by Article 29 of Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data. This advisory and independent group adopted "Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679" on 3 October 2017 as part of its work on the implementation of the GDPR and such guidance was subsequently revised and adopted on 6 February 2018. The Working Party explains profiling as "a procedure which may involve a series of statistical deductions. It is often used to make predictions about people, using data from various sources to infer something about an individual, based on the qualities of others who appear statistically similar." (Article 29 Data Protection Working Party, 2018, p. 7).
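A minimal sketch (our own, with invented data) can illustrate the inference pattern the Working Party describes: an attribute is imputed to one user from the behaviour of other users who appear statistically similar, here via nearest-neighbour matching on watch histories.

```python
# Jaccard similarity between two users, measured on watched-title sets.
def jaccard(a: set[str], b: set[str]) -> float:
    return len(a & b) / len(a | b) if a | b else 0.0

# Known users: (watch history, attribute inferred or declared earlier).
known_users = {
    "user1": ({"t1", "t2", "t3"}, "interested in documentaries"),
    "user2": ({"t1", "t2"}, "interested in documentaries"),
    "user3": ({"t8", "t9"}, "interested in action films"),
}

def infer(target_history: set[str]) -> str:
    # The attribute of the most similar known user is attributed to the
    # target, although the target never disclosed any such information.
    _, attribute = max(known_users.values(),
                       key=lambda rec: jaccard(target_history, rec[0]))
    return attribute

print(infer({"t1", "t3"}))  # -> 'interested in documentaries'
```

The target user is labelled "interested in documentaries" purely because of statistical resemblance to others, which is precisely why the GDPR treats such deductions as processing of that user's personal data.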
By setting out obligations for companies using profiling tools, on the one hand, and specific rights for the individuals whose personal data are used for profiling purposes, on the other, the GDPR aims to limit the risks arising from erroneous analysis of personal data by automated profiling mechanisms. It considers that the risks resulting from profiling are increased when decisions are fully automated, i.e., when they are exclusively made by algorithms applied to collected personal data without the involvement of any human intervention in such a process. Therefore, it provides for a set of rules applicable to profiling and automated decisions, which are supplemented by specific provisions when these decisions are exclusively automated. The main objective is to ensure that neither the decisions made as a result of automated processing or profiling, nor the collection of data for the creation of profiles and the application of these profiles to individuals, have an unjustified impact on users' rights.
This paper examines, from a legal standpoint, the main provisions of the GDPR on consumer profiling and automated decisions, as well as a sample of data protection policies from selected streaming platforms, in order to provide a legal opinion on how these policies affect in practice the use of consumers' personal data for the purpose of recommending personalised audio-visual content online. It then examines how such obligations relate to the discoverability of a diversified cultural offer online, and notably to the obligation for platforms targeting European users to give prominence to European works in their catalogues in accordance with the revised Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audio-visual media services ("Audio-visual Media Services Directive", hereinafter "AVMSD"), in order to address the risks and opportunities of platforms' profiling tools as regards cultural diversity.
PROFILING AND DATA PRIVACY PROVISIONS
Profiling is at the heart of the services provided by audio-visual and music streaming platforms. Receiving well-selected personalised music or audio-visual content is part of users' expectations when they subscribe to the services offered by streaming platforms. Yet, this entails several obligations for such platforms from a data privacy perspective, as a reflection of important users' rights that should not be overlooked.
Platforms' obligations with respect to profiling
The principles of Article 5 of the GDPR are applicable to streaming platforms as to any other personal data controller. First, the principle of transparency of data processing acquires particular importance in the case of profiling, which is a rather complex process that is often invisible and hardly understood by most users. Like other data controllers, streaming platforms must provide their users with concise, transparent, intelligible, and easily accessible information on what data are collected, how they are processed, what automated mechanisms are applied, according to what underlying logic and with what consequences, including profiling and decisions made on the basis of the profile created (e.g., the proposal of personalised content recommendations).
Other principles applicable under the GDPR include the lawfulness and fairness of data processing (e.g., data processing, including for profiling purposes, must not create discrimination), as well as purpose limitation (e.g., profiling must not use data originally collected for other purposes not initially intended by users). Data processing is lawful only if it is consented to for specific purposes. Another applicable principle is that of data minimisation (data processed must be adequate, relevant, and limited to what is necessary to achieve the consented purposes). Users must be clearly informed of the reasons for the collection of their data and the data must be processed, as much as possible, in an aggregated, anonymised or - if sufficient protection is provided - pseudonymised manner, when profiling takes place.
The data collected for profiling must also be accurate, or else they may lead to erroneous predictions or conclusions about a user's behaviour or preferences. Processing for profiling purposes seems itself particularly prone to error, since it involves inferences drawn from data that may be taken out of context and combined to produce predictions. Attention must, therefore, be paid not only to the accuracy of the data used, but also to any hidden biases that may apply when algorithms process those data, which may naturally result from the fundamentally human perceptions of the data scientists who create those algorithms. This increases the importance of the information that needs to be provided to users, in order to allow them to correct and/or improve the quality of the data collected, even and especially when the data are collected indirectly. Similarly, the processed personal data are subject to a principle of limited retention and may, therefore, only be kept for as long as necessary for the intended purposes, even though this may counter the economic interests of the platform, since the machine-learning process is essentially enriched by a continuously growing volume of data.
The obligation to inform users must apply when a user subscribes to or creates an account to use an audio-visual or music streaming platform offering personalised recommendations, but also when the platform's privacy policies are updated, and at any other time upon the user's request. As an example, Google states that "[i]f changes [to the privacy policy] are significant, we'll provide a more prominent notice (including, for certain services, email notification of Privacy Policy changes)." (Google, 2022b). Similarly, for Netflix, a notice is provided to users in case of a change in Netflix's Privacy Statement, and the continued use of the services after such a notification constitutes the acknowledgment and acceptance of the new terms. If a user does not wish to accept such updates to the privacy policy, the only option is to cancel the use of the services (Netflix, 2021). Regarding changes to the rules on the protection of personal data, Disney+ indicates that users will be notified of such changes "if these changes are material" (which remains open to discussion) and users' consent will be sought "where required by applicable law" (Disney+, 2021).
Automated processing of data for profiling purposes depends on the informed consent of the user and the data controller must be able to demonstrate that the user understands exactly what they are consenting to. Arguably, a high degree of transparency of the criteria used by the algorithms and applied to the data collected by streaming platforms is therefore required. This does not imply that streaming platforms should provide a complex explanation or full disclosure of the algorithms used - which most people would probably not be able to understand in any case, since algorithms have become increasingly complex. Nevertheless, what the data scientists take into account when programming algorithms (including the categories and segments used to define users' profiles) must be translated into a simple and intelligible form for users.
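As a purely hypothetical sketch of what such a translation could look like (the criterion names and weights below are invented for illustration), the internal weights actually applied by a recommender could be rendered as a short plain-language notice:

```python
# Plain-language descriptions of internal scoring criteria.
REASONS = {
    "history_match": "it resembles titles you watched recently",
    "similar_users": "members with similar tastes watched it",
    "trending_in_region": "it is currently popular in your area",
}

def explain(weights: dict[str, float]) -> str:
    # Report only criteria that materially contributed (above 10% of the
    # score), from most to least influential.
    used = sorted(((w, c) for c, w in weights.items() if w > 0.10),
                  reverse=True)
    parts = [f"{REASONS[c]} ({w:.0%} of the score)" for w, c in used]
    return "Suggested because " + "; ".join(parts) + "."

print(explain({"history_match": 0.55,
               "similar_users": 0.35,
               "trending_in_region": 0.10}))
# Suggested because it resembles titles you watched recently (55% of the
# score); members with similar tastes watched it (35% of the score).
```

The point is not the particular wording, but that the mapping from internal criteria to intelligible explanations can be generated systematically rather than left to lengthy legal prose.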
Such a high level of transparency and information still seems insufficiently achieved by the sample of streaming platforms whose privacy policies were analysed for this contribution (Netflix, YouTube, Disney+ and Spotify), despite some manifest improvement over the last year. To start with, these policies are far from concise and are generally difficult to understand for a user without legal training. Among the four platforms, Google (YouTube) stands out for its efforts to simplify its explanations, with entertaining videos and an easier-to-read text presentation, although the text remains long and scattered across a multitude of different and sometimes complex-to-navigate sites.
As an illustration, from the Google Privacy Policy webpage (Google, 2022b), a user may choose to do a "Privacy Check-up", which leads to another page allowing the user, among other things, to manage his or her "YouTube History". The user may choose whether to save the YouTube videos he or she watches, as well as the terms searched for on YouTube, in order to receive recommendations and be reminded of where he or she left off. The Google Privacy Policy also contains short videos that explain what information is collected and why (including, inter alia, the personalisation of content and advertising), as well as the technologies used to collect and store information (e.g., cookies, pixel tags, web browser storage, application data caches, databases, server log files). Information on some of the criteria used by YouTube's recommendation algorithms was recently added to Google's Privacy Policy, with the explicit exclusion of "sensitive categories, such as race, religion, sexual orientation, or health" from the criteria used for the personalisation of advertising (Google, 2022b). The same is not stated regarding content recommendations, though.
Netflix's Privacy Statement (Netflix, 2021) outlines the information collected automatically, including user activity (title selection, searches, movies viewed), user interactions via emails and other messages received from Netflix, and general location data. In the section on the use of collected data, Netflix explains that personal data are used, among other things, to provide personalised recommendations of films or series considered to be of a user's interest. The user cannot opt out if he or she wishes to subscribe to the platform's services. A paragraph was added in the November 2021 version of Netflix's Privacy Statement, which explains that Netflix's personalisation system aims to predict what users are in the mood to watch, but "does not infer or attach socio-demographic information (like gender or race)" to a user as part of the algorithm decision-making process. A separate link was also added to provide the user with more information on how Netflix's recommendation system works (Netflix, 2022). It offers an overview of the factors taken into account to determine what a user is expected to enjoy, such as the user's viewing history and how he/she rated other titles, information about the titles (e.g., genre, categories, actors, release date), titles watched by other members with similar tastes and preferences, time of day of activity, and for how long a user watches content streamed by Netflix. Except for the explicit exclusion of "demographic information (such as age or gender)", information on the specific categories or segments considered in profiling is not detailed.
Another interesting example may be found in the Disney+ Privacy Policy (Disney+, 2021) and Cookies Policy (Disney+, 2022). The deletion of "Flash cookies," indicated as being responsible for storing users' preferences, is discouraged: "[i]f you disable Flash cookies, you won't have access to many features that make your guest experience more efficient and some of our services will not function properly" (Disney+, 2022) and, without the collection of personal information, Disney+ may not be able to deliver certain experiences, products and services, and to take a user's interests and preferences into account. With respect to the logic behind the application of recommendation algorithms, and although it is stated that users' preferences, usage patterns, and location are collected, no further detail is provided on how these criteria are combined, nor on which categories the profiles created are based (Disney+, 2021).
As for Spotify, its privacy policy (Spotify, 2021) provides tables that are perhaps less easy to read, but that offer more detail on the data collected than the other platforms reviewed. It describes "usage data", which are said to include not only information on searches, tracks listened to, playlists and browsing history, but also inferences drawn from the user's interests and preferences based on their use of Spotify, as well as the user's "general (non-precise) location" to enable Spotify to comply with licensing agreements according to geographical areas and to provide personalised content and advertising. The purpose of the use of the data collected is explicitly stated to be to provide and personalise Spotify services.
The lack of explanation of the criteria used by the algorithms and of the categories into which users are placed for profiling purposes is all the more significant as such explanations are fundamental when addressing concerns about cultural diversity and the discoverability of new content proposed by the recommendations of these platforms (for example, Benhamou, 2016; Burri, 2019; Napoli, 2019; Richieri Hanania & Norodom, 2016).
Another important question relates to the legal basis for data processing by audio-visual and music streaming platforms. Could recourse to a legal basis other than the explicit consent of the user possibly justify that the user receives less information on the processing of his or her data, or receives it less regularly? Could another legal basis be envisaged in view of the business model of these platforms? Indeed, another acceptable legal ground for data processing under the GDPR (in addition to the consent provided by the user) is processing that is "necessary for the performance of a contract." As the concept of "necessary" should in principle be interpreted narrowly (Article 29 Data Protection Working Party, 2018, p. 13), one could argue that, while the operation of these platforms is based on personalised recommendations, the performance of the streaming service itself does not necessarily depend on such recommendations.
In practice, however, non-acceptance of the terms of service and data privacy policies established by these platforms naturally implies non-use of their services. Netflix expressly lists the performance of its service contract with each user among the legal bases for the collection and use of personal data (Netflix, 2021). From this point of view, personalised recommendations are considered essential to the delivery of the platform's services. On the page regarding its recommendation system, Netflix clearly defines its business as "a subscription service model that offers personalized recommendations" (Netflix, 2022). As for Spotify, a video on personalisation linked to the Spotify Privacy Policy (Spotify, 2022) starts by defining Spotify as a "personalised audio service" and explains in non-legal language how information collected using Spotify's services (songs played, playlists created) leads to personalised suggestions from Spotify (Spotify, 2021). Although choices on streamed content may be expressed by users according to section 2 of the Spotify Privacy Policy, personalised services are listed as one of the purposes of data processing by Spotify, based not only on users' consent, but also on the performance of the service contract with Spotify and on Spotify's legitimate interests. The user may at best contact Spotify's Data Protection Officer for further information on Spotify's balancing of its legitimate interests against the rights of users.
Finally, attention should be paid to personal data belonging to "special categories of data", i.e.,
data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person's sex life or sexual orientation (Article 9 of the GDPR).
While Netflix (with respect to content personalisation) and YouTube (regarding advertising personalisation) explicitly exclude certain of these categories from their profiling processes (Netflix, 2021; Google, 2022b), profiling may create special categories of data from the combination of data that do not initially fall into these categories. As an example, the European Article 29 Data Protection Working Party cites a "study [that] combined Facebook 'likes' with limited survey information and found that researchers accurately predicted a male user's sexual orientation 88% of the time; a user's ethnic origin 95% of the time; and whether a user was Christian or Muslim 82% of the time" (Kosinski, Stilwell & Graepel, as cited in Article 29 Data Protection Working Party, 2018, p. 15). Where preferences or characteristics belonging to these particularly sensitive categories can be inferred from profiling, the data controller must not only be able to demonstrate that the data processing is not incompatible with its original purposes, but also that it has a lawful basis for the processing (e.g., user consent), and the user must have been informed about such processing.
Users' data privacy rights and profiling
As a reflection of the above-mentioned obligations imposed on streaming platforms acting as controllers processing personal data, the GDPR recognises several users' rights. In addition to the right to be informed about the purposes of the processing, the sources of the data and the way in which the data are processed, users have a right to object, at any time, to the processing of their data, including for profiling purposes. The user must be informed of this right to object in an explicit and clear manner, separately from other information, and must be able to exercise it easily. As seen above with respect to the streaming platforms examined, exercising such a right would simply imply the non-use of their services.
In accordance with the GDPR, the user should also have access to the data processed in order not only to be able to correct any inaccurate information or even delete the profile or certain data that was used to create it, but also to know in which categories or segments of users they have been placed by profiling. The user may wish to complete or correct his or her information, as well as challenge the categories and segments that have been applied by the profiling algorithms. Recital 63 of the GDPR attempts to balance these rights with the economic interests of the controller by stating that the right of access to personal data "should not adversely affect the rights or freedoms of others, including trade secrets or intellectual property and in particular the copyright protecting the software." However, "the result of those considerations should not be a refusal to provide all information to the data subject" (a certain degree of access must therefore always be available to the user) and it is up to the controller, who must balance its interests against those of the user, to demonstrate "compelling legitimate grounds for the processing which override the interests, rights and freedoms of the data subject" (Article 21.1 GDPR). This seems to support the view that the users of streaming platforms should be given the means to better understand how the personalisation of recommendations is developed, based on which data, and in which categories they have been placed by each platform. We have seen that this is not the case in practice.
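By way of illustration only, a subject-access response giving users the visibility argued for above could take a shape such as the following sketch (all names, labels and endpoints are invented, not any platform's actual interface): the segments applied to the user, the data they were derived from, and a way to contest them, mirroring Articles 15, 16 and 21 GDPR.

```python
import json

# Hypothetical subject-access response exposing profiling segments.
access_response = {
    "data_subject": "user-4711",
    "segments": [
        {"label": "late-night viewer",
         "derived_from": ["viewing timestamps (last 90 days)"]},
        {"label": "French comedy enthusiast",
         "derived_from": ["watch history", "title ratings"]},
    ],
    # Correction and objection channels, per Articles 16 and 21 GDPR.
    "actions": {"contest_segment": "/privacy/segments/contest",
                "rectify_data": "/privacy/rectification-request"},
}
print(json.dumps(access_response, indent=2))
```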
From the perspective of users' privacy rights, music and audio-visual content streaming platforms deserve even greater attention, as the GDPR prescribes a stricter framework for fully automated decisions when they produce legal effects on an individual (e.g., cancellation of a contract, refusal of a social benefit, refusal of citizenship, etc.) or when such a decision "similarly significantly affects him or her" (Article 22.1, GDPR). This latter hypothesis covers cases where an automated decision "results in influencing the person's environment, behaviour, choices or results in a form of discrimination" (CNIL, 2018, original in French). The line between a decision causing an effect that may be considered as similarly significant in its impact on users as a legal effect, and a decision that cannot be considered as such, seems variable or at least debatable on a case-by-case basis (Article 29 Data Protection Working Party, 2018, pp. 21-22). Do content recommendations by streaming platforms have a significant effect similar to a legal effect? Do they significantly affect the behaviour and choices of individuals? The answer to this question is fundamental, because in these two hypotheses of Article 22.1 (a decision either producing legal effects on a user or affecting him or her in a similarly significant way), it is "in principle prohibited to make a decision about a person, if it is entirely automated" (CNIL, 2018, original in French).
Such prohibition is expressly mentioned by Spotify, which in its Privacy Policy (Spotify, 2021) lists users' rights arising from the GDPR, explicitly stating the right "[n]ot be subject to a decision based solely on automated decision-making (decisions without human involvement), including profiling, where the decision would have a legal effect on [the user] or produce a similarly significant effect." However, to exercise such right, the user is directed to Spotify's Data Protection Officer, without further details.
In any case, Article 22.2 of the GDPR provides for three exceptions to this prohibition of fully automated decisions when a decision affects someone's legal rights or similarly significantly affects him or her. Such exceptions also apply to automated decision-making accompanied by profiling and cover the following situations: (a) if the decision "is necessary for entering into, or performance of, a contract between the data subject and a data controller;" (b) if it "is authorised by Union or Member State law to which the controller is subject and which also lays down suitable measures to safeguard the data subject's rights and freedoms and legitimate interests;" or (c) when it "is based on the data subject's explicit consent." Even if the profiling carried out by digital platforms streaming audio-visual and music content were considered to produce a legal effect on a user or to affect him or her in a similarly significant way, such profiling could still fall within the exceptions of Article 22.2 (a) and (c), based on the performance of a contract and on the explicit consent of these platforms' users.
In the cases described under (a) and (c), however, the GDPR requires that appropriate measures protect the data subject's rights, freedoms, and legitimate interests, "at least the right to obtain human intervention on the part of the controller, to express his or her point of view and to contest the decision" made about him or her (Article 22.3). A request for human intervention should not be symbolic - the decision must be reviewable in a meaningful way, by someone with the power to change the decision and on the basis of an analysis of all relevant data. The exercise of these rights does not seem to be made explicit, or even offered, by the large streaming platforms examined in this contribution, and appears to require improvement.
As the Article 29 Data Protection Working Party advises for all controllers in the processing of personal data, these platforms
should carry out frequent assessments on the data sets they process to check for any bias, and develop ways to address any prejudicial elements, including any overreliance on correlations. Systems that audit algorithms and regular reviews of the accuracy and relevance of automated decision-making including profiling are other useful measures. (Article 29 Data Protection Working Party, 2018, p. 28)
Streaming platforms should, therefore, regularly conduct data protection impact assessments to measure the risks involved in automated decision-making, including profiling, and to determine the measures that are needed to address such risks. Such measures may include regular algorithmic auditing, data minimisation, anonymisation or pseudonymisation techniques, providing information to the data subject about the existence and logic of the automated decision-making process, explaining the consequences of such processing, and establishing a clear and easy-to-use procedure for individuals to both oppose the decision made by the automated mechanisms and express their opinion (Article 29 Data Protection Working Party, 2018, pp. 30-32). These recommended safeguards reinforce the conclusion that the information to be provided to users of streaming platforms and the specific measures to protect their rights as described above should be put in place with particular attention, so that profiling for personalisation purposes is more respectful of users' rights, and less biased from a cultural diversity standpoint.
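One of the safeguards named above, pseudonymisation, can be sketched minimally as follows (our own sketch; key management and re-identification risk controls are out of scope): a keyed hash replaces the raw identifier before events enter the profiling pipeline, so that stored records no longer directly reveal who the user is, while the controller can still link a given user's records via the secret key.

```python
import hashlib
import hmac

# Placeholder only: a real deployment would keep this key in a secrets
# vault and rotate it periodically.
SECRET_KEY = b"placeholder-key-store-in-a-vault-and-rotate"

def pseudonymise(user_id: str) -> str:
    # HMAC-SHA256 of the identifier: stable for linkage, but not
    # reversible without the secret key.
    return hmac.new(SECRET_KEY, user_id.encode("utf-8"),
                    hashlib.sha256).hexdigest()

event = {"user": pseudonymise("alice@example.com"),
         "title": "Title X",
         "watched_seconds": 1312}
print(event)
```

Because the GDPR treats pseudonymised data as personal data so long as re-identification remains possible, such a measure reduces risk but does not by itself exempt profiling from the obligations discussed above.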
PROFILING AND THE DISCOVERABILITY OF DIVERSIFIED CULTURAL CONTENT
As personalisation of content is part of the business model of streaming platforms, it is relevant to examine to what extent algorithmic recommendations based on profiling influence the users' choice of content and may thus contribute (or not) to the discoverability of culturally diverse audio-visual or music content. When it is stated in the privacy policies of streaming platforms that recommendations are based on previous consumption and viewing habits, as well as on the popularity of content in a given location or on information collected in a user's social media network, the probability for a user to discover new and diversified content seems quite low. By applying a certain categorisation to the data processed for profiling purposes, the tendency seems to be, on the contrary, to lock users into bubbles defined by their previous choices or those of their social circle.
As businesses aiming to increase their profits, streaming platforms may also tend to direct users to content selected as economically relevant for the platform, notably since streaming platforms increasingly produce their own content (Tchéhouali et al., 2022, p. 97). Also, from an economic point of view, platforms have no strategic incentive to expand content options when they are able to predict user preferences and thus reduce uncertainty about the success of broadcast content (Napoli, 2020), unless they consider that an increase in diversity would be appreciated by their users and could attract new subscribers. Such appreciation for cultural diversity and for the discovery of new cultural content from around the world may be progressively attained through education and awareness-raising on the importance of cultural diversity and intercultural dialogue, in line with Article 10 of the 2005 UNESCO Convention on the Protection and Promotion of the Diversity of Cultural Expressions. However, it remains a work in progress.
On YouTube, in order to avoid being locked into recommendations associated with previously viewed or searched content, it is necessary to suspend or delete one's history - or to go into private browsing mode (e.g., on Google Chrome, which immediately offers personalisation as soon as cookies are accepted). From a cultural diversity perspective, it would be interesting to allow the user to keep his or her search history, while choosing not to have recommendations on certain criteria. It would, therefore, be necessary that at least the main criteria used by algorithms and the categories in which users are placed be made accessible to them so that users are able, as prescribed by the GDPR, to challenge their inclusion in a specific category, or even to indicate certain categories that they feel would better represent them.
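The option suggested above can be sketched as follows (our own sketch, with invented criterion names): the user keeps his or her history, but individual profiling criteria can be switched off, after which they contribute nothing to the ranking score.

```python
def score(title_tags: set[str], profile: dict[str, float],
          opted_out: set[str]) -> float:
    # Criteria the user has disabled are simply excluded from scoring,
    # without deleting the underlying history.
    return sum(weight for criterion, weight in profile.items()
               if criterion in title_tags and criterion not in opted_out)

profile = {"comedy": 0.5, "french": 0.25, "watched_by_friends": 0.75}
title = {"comedy", "watched_by_friends"}

print(score(title, profile, opted_out=set()))                   # 1.25
print(score(title, profile, opted_out={"watched_by_friends"}))  # 0.5
```

Disabling the social-circle criterion halves this title's score while the viewing history remains intact, which is exactly the kind of granular control the GDPR's transparency and objection rights point towards.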
Algorithms applied to users' collected data may technically be programmed with the objective of promoting diversity in content supply, hence employing technology precisely in favour of diversity and the discoverability of new content, in what could be called a "governance of algorithms," in addition to a "governance by algorithms" as exercised by digital intermediaries such as streaming platforms (Burri, 2019, p. 10 and p. 14). For example, algorithms for recommending diversified content could be designed independently from the preferences previously expressed by a user if the latter had the possibility to decide not to be profiled or if he or she could explicitly express his or her interests and segments he or she would like to be part of. In fact, it was noted that some almost forgotten music tracks were able to achieve great success through the intervention of Spotify's algorithms (Carpentier, 2021).
The question remains as to what shapes cultural diversity online and how to ensure that this concept is understood as widely as possible, since, as recalled above, algorithms are ultimately programmed by people. The notion of cultural diversity held by the team of data scientists responsible for the creation of streaming platforms' algorithms is therefore fundamental in any attempt to improve the algorithms applied to determine personalised recommendations from the point of view of cultural diversity and the discoverability of diversified content.
The possibility of designing algorithms that can act in favour of cultural diversity was crucial to the new obligations imposed on streaming platforms targeting European citizens following the 2018 revision of the AVMSD. Among other measures, the revised AVMSD requires providers of on-demand audio-visual media services to "secure at least a 30% share of European works in their catalogues and ensure prominence of those works" (Article 13.1 of the AVMSD). In accordance with these provisions, streaming platforms should find ways to draw their users' attention to the European works in their catalogues. A non-exhaustive list of examples of means to attain this objective is provided in the recitals of the amending Directive 2018/1808 of the European Parliament and of the Council of 14 November 2018:
(...) The labelling in metadata of audio-visual content that qualifies as a European work should be encouraged so that such metadata are available to media service providers. Prominence involves promoting European works through facilitating access to such works. Prominence can be ensured through various means such as a dedicated section for European works that is accessible from the service homepage, the possibility to search for European works in the search tool available as part of that service, the use of European works in campaigns of that service or a minimum percentage of European works promoted from that service's catalogue, for example by using banners or similar tools. (Directive 2018/1808, 2018, Recital 35)
In addition to applying specific recommendation algorithms to highlight European works while combining such recommendations with users' tastes and preferences based on their previous consumption, a platform could also theoretically propose content in a more random fashion, without tying it completely to previously expressed preferences. This certainly entails a risk that purely random suggestions are in practice ignored by users, or even that users feel unsatisfied by recommendations that do not correctly reflect their tastes (Burri, 2019, p. 13). The possibility for users to withdraw from profiling carried out by streaming platforms without losing access to their services might be a useful complementary tool allowing for a greater balance between the platforms' interests and users' rights and concerns. A deliberately simplified sketch of such a combination is given below.
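The following sketch reflects our own assumptions, not any platform's documented method: a preference-ordered candidate list is re-ranked so that at least 30% of the displayed slots go to European works, echoing Article 13.1 AVMSD, while one slot is reserved for a random pick to aid discovery.

```python
import math
import random

def rerank(ranked: list[tuple[str, bool]], slots: int = 10) -> list[str]:
    """ranked: (title, is_european) pairs, best preference score first."""
    quota = math.ceil(0.30 * slots)
    page: list[str] = []
    # Give prominence to the best-scored European works first...
    page += [title for title, eu in ranked if eu][:quota]
    # ...then fill by preference order, leaving one slot free...
    for title, _ in ranked:
        if len(page) >= slots - 1:
            break
        if title not in page:
            page.append(title)
    # ...and reserve the last slot for serendipity.
    remaining = [title for title, _ in ranked if title not in page]
    if remaining:
        page.append(random.choice(remaining))
    return page

# Invented candidates: every fourth title is a European work.
candidates = [(f"Title {i}", i % 4 == 0) for i in range(1, 21)]
print(rerank(candidates))  # at least 3 of the 10 slots are European works
```

The quota slots still draw on the user's own preference ranking, so prominence for European works and personal relevance are combined rather than opposed.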
Greater visibility of European content, or of more diverse content of different origins, could thus be promoted in a partially random way and/or after explicit consultations with users. This could, moreover, be combined with other initiatives for the promotion of diversity by streaming platforms. One idea may be to promote certain titles from their catalogues through the organisation of online audio-visual or music festivals displayed on the homepage of the user interface over specific periods (e.g., one week or ten days), allowing users to discover new content and artists, whether related to a specific origin or to other specific themes. For example, the experience of the "My French Film Festival" (Maillet, 2016), a very successful online festival, could inspire similar initiatives on streaming platforms.
Another example may be found with Disney+, which offers in France a "Made in France" section that has been progressively enriched with more titles over the last couple of years and could certainly be expanded and replicated, even if intermittently, for other origins included in its catalogue. In recent years, Netflix has similarly put in place some relatively simple tools that may be seen as a strong step towards greater discoverability of diversified cultural content, with the addition of new categories of content. A French user may now select, for instance, "France", "European", or "International" films and series, besides traditional categories such as "Comedy", "Drama", "Thriller", or "Documentaries." An option "Surprise me" has also been added to the Netflix menu, but it seems to be strongly influenced by the ranking of the most popular content streamed on the platform.
While the practical application and concrete contribution of the AVMSD to the consumption of a more diverse offer of content will need to be assessed and measured over time, these new categories and options recently offered by streaming platforms, which make European content more easily discoverable, suggest that the provisions of the AVMSD are already producing significant practical effects. Although improving these tools from a discoverability perspective likely requires greater awareness of what cultural diversity means and how to value the richness of our increasingly multicultural societies, they may undoubtedly be acknowledged as a non-negligible step.
CONCLUSION
While recourse to automated profiling for personalisation purposes is at the heart of streaming platforms' services, the protection of users' rights requires transparency with respect to the processing of their personal data. From a data privacy perspective, much may still be done to improve the transparency of the algorithms used for personalisation purposes and to provide users with greater control over their data, as required by the GDPR.
Users need to be able to clearly understand what data are processed and how, what their rights are, and the procedures for exercising such rights. This means, first, that the privacy policies of streaming platforms need to be simplified, so that all processing of personal data is better understood. User-friendly explanations of how profiling is conducted are particularly necessary because of the impact of automated systems on the future choices of users and the difficulty, for the majority of individuals, of understanding how algorithms work from a technical point of view. In addition to greater transparency, streaming platforms should act to continuously remove biases in profiling and allow users to object to such profiling (without losing access to the platform), to see the segments in which they have been automatically categorised and on what basis, to request human intervention to review the profiling if needed, and possibly also to select categories that they want to be part of. Such improvements are also in accordance with the recent recommender system transparency obligations imposed by Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services (the "Digital Services Act" - DSA). According to the DSA, online platforms will need to disclose the main parameters of their recommender systems and provide their users with the possibility of modifying or influencing those parameters.
Finally, greater efforts are needed to raise the awareness of the public at large with respect not only to users' personal data rights, but also regarding the importance of cultural diversity and access to a diversified cultural offer online, with a view to fostering more peaceful and tolerant societies. Algorithms used by streaming platforms for personalisation purposes can and should be directed towards greater discoverability of culturally diverse content. At the European level, the AVMSD offers a valuable illustration of this possibility, by guiding streaming platforms in such a direction when it requires prominence and a minimum quota of European works in their catalogues.
References
*Article 29 Data Protection Working Party (2018). Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679, adopted on 3 October 2017, as last revised and adopted on 6 February 2018, 17/EN WP251rev.01. Retrieved from https://www.cnil.fr/sites/default/files/atoms/files/20171025_wp251_enpdf_profilage.pdf.
*Benhamou, F. (2016). Les industries culturelles. Mondialisation et marchés nationaux. L'Économie à l'heure du numérique, Cahiers français 392, La Documentation française, Paris, 2016.
*Burri, M. (2016). Exposure diversity as a new cultural policy objective in the digital age. In L. Richieri Hanania & A.-T. Norodom (Eds.), Diversité des expressions culturelles à l'ère du numérique. Teseo. Retrieved from https://www.teseopress.com/diversitedesexpressionsculturellesetnumerique/.
*Burri, M. (2019). Découvrabilité de contenus locaux, régionaux et nationaux en ligne : cartographie des obstacles à l'accès et possibilité de nouveaux outils d'orientation. Document de réflexion, 7-8 February 2019. Retrieved from https://www.dropbox.com/sh/x7zo7icvxiztqyk/AADad4-ybLBiIv_x7d2ifCPXa?dl=0&preview=Burri_D%C3%A9couvrabilit%C3%A9+des+contenus+locaux%2C+nationaux+et+r%C3%A9gionaux+en+ligne.pdf.
*Carpentier, L. (2021, February 15). L'algorithme, nouvelle machine à tubes. Le Monde. Retrieved 03/04/2022, from https://www.lemonde.fr/culture/article/2021/02/15/l-algorithme-nouvelle-machine-a-tubes_6069977_3246.html.
*CNIL (2018, May 29). Profilage et décision entièrement automatisée. CNIL. Retrieved from https://www.cnil.fr/fr/profilage-et-decision-entierement-automatisee.
*Disney+ (2021, January 13). Legal - Privacy Policy. Retrieved from https://www.disneyplus.com/en-gb/legal/privacy-policy.
*Disney+ (2022). Cookies Policy. Retrieved from https://www.disneyplus.com/en-gb/legal/cookies-policy.
*European Data Protection Board (EDPB) (2021). Guidelines, Recommendations, Best Practices. Retrieved from https://edpb.europa.eu/our-work-tools/general-guidance/guidelines-recommendations-best-practices_en.
*European Union (2010). Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audio-visual media services, as amended by Directive (EU) 2018/1808 of the European Parliament and of the Council of 14 November 2018 (Audio-visual Media Services Directive). Retrieved from https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A02010L0013-20181218.
*European Union (2016). Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data and repealing Directive 95/46/EC (General Data Protection Regulation). Retrieved from https://eur-lex.europa.eu/legal-content/EN/TXT/HTML/?uri=CELEX:32016R0679&from=FR.
*European Union (2018). Directive 2018/1808 of the European Parliament and of the Council of 14 November 2018 amending Directive 2010/13/EU on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) in view of changing market realities. Retrieved from https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=celex%3A32018L1808.
*European Union (2022). Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act). Retrieved from https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A32022R2065&qid=1666857835014.
*Google (2022a). Google Product Privacy Guide. Retrieved from https://policies.google.com/technologies/product-privacy?hl=en-US.
*Google (2022b). Privacy Policy, 10 February 2022. Retrieved from https://policies.google.com/privacy?hl=en.
*Google (2022c). Security and Privacy. Retrieved from https://safety.google/security-privacy/.
*Maillet, P. (2016). Projet 'My French Film Festival'. In L. Richieri Hanania & A.-T. Norodom (Eds.), Diversité des expressions culturelles à l'ère du numérique. Teseo. Retrieved from https://www.teseopress.com/diversitedesexpressionsculturellesetnumerique/.
*Napoli, P. M. (2019). Diversité de contenus à l'ère numérique : découvrabilité de contenu diversifié aux échelons local, régional et national. Document de réflexion, 7-8 février 2019. Retrieved from https://culturenumeriqc.qcnum.com/wp-content/uploads/2019/03/Napoli-De%CC%81couvrabilite%CC%81-de-contenu-diversifie%CC%81-aux-e%CC%81chelons-local-re%CC%81gional-et-national.pdf.
*Netflix (2021, November 2). Privacy Statement. Retrieved from https://help.netflix.com/legal/privacy.
*Netflix (2022). How Netflix's Recommendations System Works. Retrieved from https://help.netflix.com/en/node/100639.
*Ochai, O. (2022). New opportunities and challenges for inclusive cultural and creative industries in the digital environment. In UNESCO, Reshaping Policies for Creativity - Addressing culture as a global public good. Retrieved from https://unesdoc.unesco.org/ark:/48223/pf0000380474.
*Richieri Hanania, L. & Norodom, A.-T. (Eds.) (2016). Diversité des expressions culturelles à l'ère du numérique. Teseo. Retrieved from https://www.teseopress.com/diversitedesexpressionsculturellesetnumerique/.
*Spotify (2021, September 1). Spotify Privacy Policy. Retrieved from https://www.spotify.com/bb/legal/privacy-policy/.
*Spotify (2022). Personalization. Retrieved from https://www.youtube.com/watch?v=B9iWpJfw4XY.
*Tchéhouali, D., Vodouhé, C., Richieri Hanania, L. & Grondin, W. (2022). Impacts des algorithmes de recommandation des plateformes de streaming sur la protection de la vie privée des utilisateurs canadiens. Research report ALTER ALGO, les algorithmes sont-ils vraiment nos alter ego numériques ?. Internet Society - Chapitre Québec Canada and ORISON - Observatoire des Réseaux et Interconnexions de la Société Numérique, Chaire UNESCO Communication et technologies pour le développement de l'UQAM, 28 January 2022. Retrieved from https://www.ieim.uqam.ca/spip.php?article13912&lang=fr.
*UNESCO (2005). Convention on the Protection and Promotion of the Diversity of Cultural Expressions. Adopted 20/10/2005, entered into force 18/03/2007. 2440 UNTS 311. Retrieved from https://en.unesco.org/creativity/convention/texts.
*YouTube (2022, January 5). Terms of service. Retrieved from https://www.youtube.com/static?template=terms.