Glossary of Platform Law and Policy Terms

Dark Patterns

Author: Nicolo Zingales

This entry discusses: (i) the notion of ‘dark patterns’, its history and evolution; (ii) a taxonomy of dark patterns; (iii) existing regulatory and consumer advocacy responses to dark patterns.

(i) A ‘dark pattern’ is a user interface design choice that benefits an online service by coercing, steering, or deceiving users into making unintended and potentially harmful decisions (Mathur et al., 2019)1. The expression was coined in 2010 by user experience designer Harry Brignull, who created an online guide and repository of cases (darkpatterns.org, currently maintained by Alexander Darlington) and referred to a “user interface that has been carefully crafted to trick users into doing things, such as buying or signing up for things”. It should be noted that this early definition included three central elements: deceptiveness, deliberateness, and the accomplishment of the deceptive purpose. Later definitions refined the concept along less deterministic lines, dispensing with the requirements of specific intent and of specific effects of the deception on users: for instance, Mathur et al. (2019) refer more generally to “benefiting an online service by coercing, steering or deceiving” (emphasis added), and Luguri and Strahilevitz (2021)2 to “user interfaces whose designers knowingly confuse users, make it difficult for users to express their actual preferences, or manipulate users into taking certain actions” (emphasis added). The term is also closely linked to the literature on malicious interface design techniques, defined as “interfaces that manipulate, exploit or attack users” (Conti and Sobiesk, 2010)3, and to the broader concept of nudging, defined as “influencing choice without limiting the choice set or making alternatives appreciably more costly in terms of time, trouble, social sanctions, and so forth” (Hausman and Welch, 2010)4. The element common to all existing definitions, however, is the covert and insidious nature of dark patterns, which in certain cases may amount to legally actionable fraud, unfair commercial practices or other violations of consumer and data protection rules.

(ii) Existing literature has broken down dark patterns into different categories. The most complete taxonomy to date has been offered by Luguri and Strahilevitz (2021)5, who have reviewed existing taxonomies and identified seven general categories, each divided into types or ‘variants’. The following is the list of categories and their corresponding types:

  • Nagging, which includes only one type, consisting of “repeated requests to do something the firm [as opposed to the user] prefers”;
  • Social Proof, including ‘Activity Message’ (informing the user about activity on the website, e.g., purchases, views, visits) and ‘Testimonials’ (testimonials of unclear origin on a product page);
  • Obstruction, including ‘Roach Motel’ (asymmetry between signing up and cancelling), ‘Price Comparison Prevention’ (frustrating comparison shopping) and ‘Intermediate Currency’ (denominating purchases in a virtual currency to obscure their cost);
  • Sneaking, including ‘Sneak into Basket’ (adding additional products to users’ shopping carts without their consent), ‘Hidden Costs’ (revealing previously undisclosed charges to users right before they make a purchase), ‘Hidden Subscription’ (charging users for an unanticipated or undesired automatic renewal) and ‘Bait and Switch’ (selling the customer something other than what was originally advertised);
  • Interface Interference, including ‘Hidden Information/Aesthetic Manipulation/False Hierarchy’ (visually obscuring important information), ‘Pre-selection’ (pre-selecting a firm-friendly default), ‘Toying with Emotion’ (emotionally manipulative framing), ‘Trick Questions’ (intentional or obvious ambiguity), ‘Disguised Ad’ (inducing consumers to click on something that is not apparent as an ad) and ‘Confirmshaming’ (framing the option to decline so that it seems dishonest or foolish);
  • Forced Action, including ‘Forced Registration’ (tricking consumers into thinking registration is necessary);
  • Urgency, including ‘Low Stock/High-demand Message’ (falsely informing consumers of limited quantities) and ‘Countdown Timer’ (a message that an opportunity ends soon, accompanied by a blatantly false visual cue).

Another useful taxonomy is the one developed by Mathur et al. (2019)6, who identify five dimensions along which dark patterns can be measured: asymmetric burden, covertness, deceptiveness, hiding of information, and restrictiveness of available choices.
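These five dimensions can be thought of as a per-pattern profile. The sketch below is a minimal illustration only: the class, field names, and the example assessment are our own paraphrases and assumptions, not classifications taken from the cited paper.

```python
from dataclasses import dataclass, fields

@dataclass
class DarkPatternProfile:
    """Profile of a dark pattern along the five dimensions described by
    Mathur et al. (2019). Field names are illustrative paraphrases."""
    name: str
    asymmetric: bool = False         # imposes unequal burdens on the choices
    covert: bool = False             # hides the mechanism of influence
    deceptive: bool = False          # induces false beliefs in the user
    hides_information: bool = False  # obscures or delays needed information
    restrictive: bool = False        # eliminates choices that should be available

    def dimensions(self) -> list[str]:
        """Names of the dimensions this pattern exhibits."""
        return [f.name for f in fields(self)
                if f.name != "name" and getattr(self, f.name)]
```

As a hypothetical assessment, a ‘Hidden Costs’ practice might be profiled as `DarkPatternProfile("Hidden Costs", deceptive=True, hides_information=True)`, whose `dimensions()` are `["deceptive", "hides_information"]`.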

Domain-specific dark patterns have also been identified, sometimes creating new categories or types. For instance, in the privacy field, Bösch et al. (2016)7 added ‘Hidden Legalese Stipulations’ (hiding malicious information in lengthy terms and conditions), and the French data protection authority identified a range of practices interfering with privacy choices, from “pushing the individual to accept sharing more than what is strictly necessary” and “influencing consent” to “creating frictions with data protection actions” and “diverting the individual” (CNIL, 2019)8. In the context of users’ spatial relationship with digital devices, Greenberg et al. (2014)9 introduced ‘Captive Audience’ (taking advantage of users’ need to be in a particular location or perform a particular activity to insert an unrelated interaction) and ‘Attention Grabber’ (visual effects that compete for users’ attention).

(iii) As dark patterns may constitute a violation of existing legal rules, regulators have recently issued specific guidance in the fields of consumer protection (Authority for Consumers and Markets, 2020)10 and data protection (CNIL, 2019)11. Furthermore, consumer organizations have published reports finding problematic uses of dark patterns with regard to data collection (Norwegian Consumer Council, 2018; Transatlantic Consumer Dialogue and Heinrich Böll Stiftung, 2020)12 13, and academic studies have demonstrated the influence of dark patterns on compliance with the GDPR requirements for valid consent (Nouwens et al., 2020)14. These guidance documents and reports highlight the possible liability arising from dark patterns in relation to misleading and aggressive commercial practices, the violation of privacy by design and the rules on free, informed, and specific consent. They also note the insufficiency of self-regulation, which by contrast is a central feature of a legislative bill (the DETOUR Act) introduced into the US Senate in 2019 by Senator Mark Warner to prohibit large online platforms from using deceptive user interfaces, known as ‘dark patterns’, to trick consumers into handing over their personal data. The bill would entrust an industry association with the formulation of guidelines for the design practices of large online platforms, and would even provide a safe harbor against enforcement by the Federal Trade Commission.

Scholarly work on the connection between dark patterns and data protection falls into three strands: user studies, which conduct experiments aimed at gauging the impact of specific dark patterns (see e.g. Utz et al., 2019)15; measurement and detection studies, which use semi-automated techniques to measure the prevalence of dark patterns in a specific domain (e.g., Mathur et al., 2019)16; and finally, compliance studies, which examine the compatibility of certain dark patterns with existing law (e.g., Nouwens et al., 2020)17.
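Measurement and detection studies typically combine large-scale crawling with text classification. A minimal sketch of the flagging step is below; the fixed keyword patterns are our own crude illustration, far simpler than the crawling-plus-clustering pipelines used in the cited studies:

```python
import re

# Illustrative keyword patterns for 'Urgency'-type messages. Real measurement
# studies (e.g., Mathur et al., 2019) crawl thousands of sites and cluster
# candidate texts rather than relying on a hand-written list like this one.
URGENCY_PATTERNS = [
    re.compile(r"only \d+ left in stock", re.IGNORECASE),
    re.compile(r"\d+ (?:people|others) (?:are )?(?:viewing|looking at) this",
               re.IGNORECASE),
    re.compile(r"(?:offer|sale|deal) ends in \d+", re.IGNORECASE),
]

def flag_urgency(text: str) -> list[str]:
    """Return the substrings of `text` that match an urgency keyword pattern."""
    hits = []
    for pattern in URGENCY_PATTERNS:
        hits.extend(match.group(0) for match in pattern.finditer(text))
    return hits
```

For instance, `flag_urgency("Hurry! Only 3 left in stock and 12 people are viewing this item.")` flags both the low-stock and the high-demand message.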

References

  1. Mathur, A., et al. (2019). Dark patterns at scale: Findings from a crawl of 11K shopping websites. Proceedings of the ACM on Human-Computer Interaction, 3(CSCW), 1-32. Available at: https://arxiv.org/abs/1907.07032.
  2. Luguri, J., Strahilevitz, L. J. (2021). Shining a light on dark patterns. Journal of Legal Analysis, 13(1), 43-109. Available at: https://ssrn.com/abstract=3431205.
  3. Conti, G., Sobiesk, E. (2010). Malicious interface design: Exploiting the user. In: Proceedings of the 19th International Conference on World Wide Web, 271-280. Available at: https://doi.org/10.1145/1772690.1772719.
  4. Hausman, D. M., Welch, B. (2010). Debate: To nudge or not to nudge. Journal of Political Philosophy, 123-136.
  5. Luguri, J., Strahilevitz, L. J. (2021). Shining a light on dark patterns. Journal of Legal Analysis, 13(1), 43-109. Available at: https://ssrn.com/abstract=3431205.
  6. Mathur, A., et al. (2019). Dark patterns at scale: Findings from a crawl of 11K shopping websites. Proceedings of the ACM on Human-Computer Interaction, 3(CSCW), 1-32. Available at: https://arxiv.org/abs/1907.07032.
  7. Bösch, C., et al. (2016). Tales from the Dark Side: Privacy dark strategies and privacy dark patterns. Proceedings on Privacy Enhancing Technologies, 2016(4), 237-254.
  8. Commission Nationale de l’Informatique et des Libertés – CNIL. (2019). Shaping Choices in the Digital World. From dark patterns to data protection: the influence of UX/UI design on user empowerment. IP Reports Innovation and Foresight N°06. Available at: https://linc.cnil.fr/sites/default/files/atoms/files/cnil_ip_report_06_shaping_choices_in_the_digital_world.pdf.
  9. Greenberg, S., et al. (2014). Dark patterns in proxemic interactions: A critical perspective. In: Proceedings of the 2014 Conference on Designing Interactive Systems, 523-532.
  10. Authority for Consumers and Markets – ACM. (2020). Guidelines on the Protection of the Online Consumer. Boundaries of online persuasion. Available at: https://www.acm.nl/sites/default/files/documents/2020-02/acm-guidelines-on-the-protection-of-the-online-consumer.pdf.
  11. Commission Nationale de l’Informatique et des Libertés – CNIL. (2019). Shaping Choices in the Digital World. From dark patterns to data protection: the influence of UX/UI design on user empowerment. IP Reports Innovation and Foresight N°06. Available at: https://linc.cnil.fr/sites/default/files/atoms/files/cnil_ip_report_06_shaping_choices_in_the_digital_world.pdf.
  12. Norwegian Consumer Council. (2018). Deceived by Design: How tech companies use dark patterns to discourage us from exercising our rights to privacy. Available at: https://fil.forbrukerradet.no/wp-content/uploads/2018/06/2018-06-27-deceived-by-design-final.pdf.
  13. Transatlantic Consumer Dialogue and Heinrich Böll Stiftung. (2020). Privacy in the EU and US: Consumer experiences across three global platforms. Available at: https://eu.boell.org/en/2019/12/11/privacy-eu-and-us-consumer-experiences-across-three-global-platforms.
  14. Nouwens, M., et al. (2020). Dark patterns after the GDPR: Scraping consent pop-ups and demonstrating their influence. In: Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, 1-13.
  15. Utz, C., et al. (2019). (Un)Informed Consent: Studying GDPR consent notices in the field. In: Proceedings of the 2019 ACM SIGSAC Conference on Computer and Communications Security (CCS ’19), 973-990. Available at: https://doi.org/10.1145/3319535.3354212.
  16. Mathur, A., et al. (2019). Dark patterns at scale: Findings from a crawl of 11K shopping websites. Proceedings of the ACM on Human-Computer Interaction, 3(CSCW), 1-32. Available at: https://arxiv.org/abs/1907.07032.
  17. Nouwens, M., et al. (2020). Dark patterns after the GDPR: Scraping consent pop-ups and demonstrating their influence. In: Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, 1-13.

By Nicolo Zingales

Nicolo Zingales is Professor of Information Law and Regulation at the law school of the Fundação Getulio Vargas in Rio de Janeiro, and coordinator of its E-commerce research group. He is also an affiliated researcher at the Stanford Center for Internet and Society, the Tilburg Law & Economics Center and the Tilburg Institute for Law and Technology, and co-founder and co-chair of the Internet Governance Forum’s Dynamic Coalition on Platform Responsibility.
