Glossary of Platform Law and Policy Terms

Right to Explanation

Cite this article as:
Nicolo Zingales (17/12/2021). Right to Explanation. In Belli, L.; Zingales, N. & Curzi, Y. (Eds.), Glossary of Platform Law and Policy Terms (online). FGV Direito Rio.

Author: Nicolo Zingales

See also: Appeal

The concept of ‘right to explanation’ refers to the informational duties owed by a data controller to a data subject in relation to automated decisions based on profiling. The basic provision for the construct of the ‘right to explanation’ is Article 22 of the General Data Protection Regulation (GDPR), which establishes a right for any data subject “not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her”. This provision is the evolution of article 15 of the Data Protection Directive (DPD), which in turn finds its historical root in the French law of 1978 “on computing, files, and freedoms”. That law provided a broader right not to be subject to any decision involving an appraisal of human behavior based solely on the automated processing of data describing the profile or personality of the individual. The same law also granted the right to know and challenge the information and reasoning used in such processing where the data subject opposed the results.

While the scope of this right is much narrower in both art 15 DPD and art 22 GDPR, one can discern from the Directive’s Travaux Préparatoires the same concern for human dignity: specifically, that humans maintain the primary role in ‘constituting’ themselves instead of relying entirely on (possibly erroneous) mechanical determinations based on their “data shadow”. Arguably, that concern underlies art 22 GDPR despite the more specific focus in its Travaux Préparatoires on the risks of decisions based on profiling, which is defined in art 4(4) as “any form of automated processing of personal data consisting of using those data to evaluate certain personal aspects relating to a natural person, in particular, to analyze or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behavior, location or movements”. Accordingly, the explicit mention of profiling could be interpreted simply as illustrative of one of the possible risks involved in automated processing.

Another important difference between art 22 GDPR and art 15 DPD is the requirement of “suitable” safeguards when any derogation from the right established in art 22(1) is applied; such derogations are admitted only where provided by a law to which the controller is subject. ‘Suitable measures’ to safeguard the data subject’s rights, freedoms and legitimate interests are also required when applying one of the exceptions to the rule laid down in art 22(1), i.e., necessity for entering into or performing a contract, and the data subject’s explicit consent (a novelty introduced by the GDPR). Detailing the application of these exceptions, but arguably also informing the application of derogations, art 22(3) specifies that ‘suitable measures’ consist at least of the right of the data subject “to obtain human intervention on the part of the controller, to express his or her point of view and to contest the decision”.

This aspect has triggered a significant discussion on the existence of a right to explanation, as the suitable safeguards listed in Recital 71 of the GDPR include the right “to obtain an explanation of the decision reached after such assessment”, yet this wording is conspicuously absent from the text of art 22(3). Regardless of whether the right provided by art 22 is qualified as one to “an explanation” (as we will call it here for the sake of simplicity), to ‘information’ or to ‘legibility’, it is indisputable that the article seeks to put data subjects in a position to appreciate, at least to a certain degree, the logic of any algorithm relied upon to take measures which significantly affect them. Besides the obvious question of the requisite degree of transparency and granularity of an explanation, the provision leaves ambiguous two important issues: (1) whether art 22 implies a prohibition on processing personal data without fulfilling the relevant criteria, or rather a right for the data subject to actively object to such processing; and (2) the possibility for data controllers to condition access requests on the payment of a fee.

Even admitting that consumers were able to use and keep perfect track of all the revealed data, they would still largely remain unaware of predictions and decisions based on such data, at least in the absence of proactive measures taken by data controllers to that effect. EU data protection law specifically addresses this problem by establishing the right for individuals to obtain basic knowledge of the logic of any automated decisions that produce a legal effect or otherwise significantly affect them, and a right not to be subjected to such decisions outside a narrow set of circumstances. Regrettably, the right to obtain such knowledge has historically been under-enforced, mainly due to the ambiguity of article 15 of the Data Protection Directive concerning its scope of application. However, there is reason to think that this situation will change under the GDPR, which details the minimum safeguards that must be offered when such decisions are made.


Edwards, L., Veale, M. (2017). Slave to the algorithm: Why a right to an explanation is probably not the remedy you are looking for. Duke Law & Technology Review, 16, 18.

Goodman, B., Flaxman, S. (2017). European Union regulations on algorithmic decision-making and a “right to explanation”. AI Magazine, 38(3), 50-57.

Malgieri, G., Comandé, G. (2017). Why a right to legibility of automated decision-making exists in the general data protection regulation. International Data Privacy Law.

Malgieri, G. (2019). Automated decision-making in the EU Member States: The right to explanation and other “suitable safeguards” in the national legislations. Computer Law & Security Review, 35(5), 105327.

Mendoza, I., Bygrave, L. A. (2017). The right not to be subject to automated decisions based on profiling. In EU Internet Law. Springer, Cham. 77-98.

Selbst, A., Powles, J. (2018, January). “Meaningful Information” and the Right to Explanation. In Conference on Fairness, Accountability and Transparency. PMLR. 48-48.

Wachter, S., Mittelstadt, B., Floridi, L. (2017). Why a right to explanation of automated decision-making does not exist in the general data protection regulation. International Data Privacy Law, 7(2), 76-99.



Nicolo Zingales is Professor of Information Law and Regulation at the law school of the Fundação Getulio Vargas in Rio de Janeiro, and coordinator of its E-commerce research group. He is also an affiliated researcher at the Stanford Center for Internet and Society, the Tilburg Law & Economics Center and the Tilburg Institute for Law and Technology, co-founder and co-chair of the Internet Governance Forum’s Dynamic Coalition on Platform Responsibility.
