No User is an Island - Relational Disclosure as a regulatory strategy to promote users' awareness in data processing

Antonio Davola (Luiss University - University of Amsterdam)
Ilaria Querci (Luiss University)

Abstract

Digital markets are flexible and constantly evolving, and so is privacy law.
Both before and after the enactment of the GDPR, data protection rules have drawn on contributions from, among others, sociology, anthropology, economics, and marketing, in order to incorporate emerging empirical and theoretical findings and, accordingly, to shape the exercise of users' rights throughout their interactions with business operators.
This is, of course, because privacy has an inherent social dimension: the concepts of identity and autonomy, of equality and freedom, and of the meaning of social and political relations all play a distinct role in privacy law.
Undoubtedly, a central role in the construction of privacy and data protection law has been played by decision-making studies: since its early days, individual protection has been structured according to the axioms of neoclassical economic theory concerning market equilibria in competitive markets, the determinants of individual decision-making, and profit-maximizing rational strategies.
Accordingly, the attribution of rights to users – and the corresponding imposition of obligations on professionals – in B2C transactions has been significantly shaped by the view of individuals as homines oeconomici.
Yet, as soon as deviations from this traditional paradigm emerged, the law proved able to evolve as well, progressively adjusting to encompass new approaches to online interaction that largely contrast with the rigidity of conventional economic theories of individual behavior.
Significant improvements include, inter alia, the rearrangement of disclosure duties in light of studies on information overload, as well as the adoption of nudge-based regulatory strategies drawn from behavioral research.
Accordingly, the GDPR principle of transparency requires that any information or communication relating to the processing of personal data be easily accessible and easy to understand, and that clear and plain language be used, thus moving toward a substantive rather than merely formal transparency that can actually promote informational self-determination.
Nevertheless, some axioms of the neoclassical model as it was originally conceived are still present in consumer law, despite being widely debated among economic scholars.
In particular, the assumption of a-social individualism still permeates the structure of user rights: individuals' desires and preferences regarding privacy and data management are deemed exogenous, unaffected by interaction with others or by the observation of their peers' behavior.
In contemporary privacy scholarship, the importance of privacy has mostly been justified by the individual interests and rights it protects, the most important of which are individual freedom and autonomy in liberal-democratic societies. From this perspective, it is the autonomy of individuals that is at stake in protecting the privacy of personal data and communication in the digital era. This perspective, however, seems insufficient to account for many other concerns raised in the debates on privacy-invasive technologies. With ever greater frequency, privacy-invasive technologies have been argued not only to endanger individual interests but also to affect society and social life more generally.
Privacy scholarship cannot address these broader societal concerns about privacy-invasive technologies, however, unless it moves beyond the traditional concept of privacy focused on the individual.
This is all the more relevant considering that, in accordance with this view, privacy-related rights (and, in particular, disclosure obligations) are generally regulated through standard, a-contextual models that focus on the individual relationship between the professional operator and its specific counterparty.
While the GDPR states that individuals should be made aware of the risks, rules, safeguards, and rights in relation to the processing of personal data and of how to exercise their rights in relation to such processing (Arts. 13 and 14 GDPR), and that consent be specific, informed, and unambiguous (Recital 32), disclosure rules still rely on the idea of users as individual decision-makers who are able to process information and make choices without any need for contextualization.
European privacy law rests on the implicit assumption that consent to the processing of personal data and the analysis of Big Data is a purely individual choice. Accordingly, privacy lawyers mainly focus on how to empower users to make free and informed choices, for instance through debiasing and nudging.
The research focuses on an unspoken axiom characterizing both neoclassical theory and existing consumer law: the individual nature of decision-making.
Accordingly, the paper examines evidence emerging from studies and experiments showing that consent in data processing is not only often partially irrational, but also inherently relational. It then observes that the regulatory framework laid down by the GDPR does not take this aspect properly into account, and it defends the development of a system of contextualized disclosure as a tool to promote both trust in AI systems and informed consent, using personalized services as its experimental setting.

