Glossary of Platform Law and Policy Terms


Cite this article as:
Nicolo Zingales (17/12/2021). Responsibility. In Belli, L.; Zingales, N. & Curzi, Y. (Eds.), Glossary of Platform Law and Policy Terms (online). FGV Direito Rio.

Author: Nicolo Zingales

The concept of responsibility refers, in its simplest form, to a duty to undertake a particular action or set of actions. Such a duty can be legal, but also moral, social, or ethical. If it is legally enforceable, failure to fulfill the duty gives rise to liability. However, even where such enforcement is not available, failing to fulfill one's responsibility can have significant legal, social, and even financial consequences. For instance, an advertiser boycott (the so-called 'adpocalypse') took place in 2016 over YouTube's alleged failure to fulfill its responsibility to prevent ads from being associated with terrorist content. A similar boycott, known as "Stop Hate for Profit", occurred in 2020 over Facebook's failure to take responsibility for incitement to violence against protesters fighting for racial justice in America in the wake of the killings of George Floyd, Breonna Taylor, Tony McDade, Ahmaud Arbery, Rayshard Brooks and many others.

Platform responsibility is the concept that brought together a variety of stakeholders, leading to the establishment of the Dynamic Coalition on Platform Responsibility (DCPR) in 2014. As noted in the DCPR Outcome Book in 2017, faced with the proliferation of private ordering regimes in online platforms, stakeholders began to question the conceptual issues surrounding the moral, social, and human rights responsibility of the private entities that set up such regimes. This use of the notion of "responsibility" has not gone unnoticed, having been taken up, for example, by a 2014 UNESCO special report, the study on self-regulation by the Institute for Information Law of the University of Amsterdam, the 2016 Report of the UN Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, the Center for Law and Democracy's Recommendations on Responsible Tech, and the Council of Europe's draft Recommendation on the roles and responsibilities of Internet intermediaries.

The articulation of responsibilities for a particular stakeholder is typically linked to the 'role' attributed to it in a particular process or system. In Internet governance, this goes back to the 2005 Tunis Agenda for the Information Society, which established the attributions of different stakeholders in the management of the Internet, recognizing in particular that governments should have an equal role and responsibility for international Internet governance and for ensuring the stability, security, and continuity of the Internet. Practically, this is a careful choice of wording: it does not articulate a corresponding liability, while still calling on governments to act in a particular domain. Although the notion of responsibility can be exemplified or explained with reference to specific forms of due diligence, or even a duty of care, it typically falls to adjudicators to make those articulations when defining the scope of responsibility. Occasionally, this interpretation can be facilitated by authoritative guidelines. An example is the articulation of the due diligence process that businesses are expected to follow in relation to their human rights impact, in particular: 

(a) Identifying and assessing actual or potential adverse human rights impacts that the enterprise may cause or contribute to through its own activities, or which may be directly linked to its operations, products or services by its business relationships; 

(b) Integrating findings from impact assessments across relevant company processes and taking appropriate action according to its involvement in the impact; 

(c) Tracking the effectiveness of measures and processes to address adverse human rights impacts in order to know if they are working; and 

(d) Communicating on how impacts are being addressed and showing stakeholders – in particular, affected stakeholders – that there are adequate policies and processes in place. 

This guidance is provided by the Guiding Principles on Business and Human Rights, unanimously endorsed by the UN Human Rights Council in 2011, which establish a clear separation between the duty of States to protect human rights, the responsibility of businesses to respect them, and the joint duty of both to provide effective remedies.


Belli, L. & Zingales, N. (2017). Online Platforms' Roles and Responsibilities: A Call for Action. In Belli, L. & Zingales, N. (Eds.), Platform Regulations: How Platforms Are Regulated and How They Regulate Us (pp. 21-32). Leeds.

Center for Law & Democracy. (2016). Recommendations for Responsible Tech. Available at:

Council of Europe. (2017). Recommendation CM/Rec 2017 of the Committee of Ministers to member states on the roles and responsibilities of internet intermediaries. Available at:

Tunis Agenda for the Information Society (18 November 2005). WSIS-05/TUNIS/DOC/6(Rev. 1)-E. Available at:

UN/OHCHR. (2011). Guiding Principles on Business and Human Rights: United Nations “Protect, Respect and Remedy” Framework. HR/PUB/11/04. Available at:

UN/OHCHR. (2016). A/HRC/32/38. Report of the Special Rapporteur to the Human Rights Council on Freedom of expression, states, and the private sector in the digital age. Available at:


Stop Hate for Profit. Available at:



Nicolo Zingales is Professor of Information Law and Regulation at the law school of the Fundação Getulio Vargas in Rio de Janeiro, and coordinator of its E-commerce Research Group. He is also an affiliated researcher at the Stanford Center for Internet and Society, the Tilburg Law & Economics Center, and the Tilburg Institute for Law and Technology, as well as co-founder and co-chair of the Internet Governance Forum's Dynamic Coalition on Platform Responsibility.
