Glossary of Platform Law and Policy Terms

Notice-and-Staydown

Cite this article as:
Nicolo Zingales (17/12/2021). Notice-and-Staydown. In Belli, L.; Zingales, N. & Curzi, Y. (Eds.), Glossary of Platform Law and Policy Terms (online). FGV Direito Rio. https://platformglossary.info/notice-and-staydown/.

Author: Nicolo Zingales

‘Notice-and-staydown’ (NSD) refers to a system of intermediary liability in which, following a qualified notice, the intermediary is required not only to remove or disable access to allegedly infringing content, but also to prevent further infringements by restricting the upload to the platform of the same or equivalent content. There is some ambiguity as to whether this model requires the prevention of uploads only of identical content, or whether it also extends to content with minor alterations (for instance, a shorter version of a previously infringing video). The latter interpretation is favored at least in Europe, following the ruling of the European Court of Justice in Case C-18/18, Eva Glawischnig-Piesczek v. Facebook1, where the Court held that the prohibition of general monitoring obligations in Article 15 of the E-Commerce Directive “must be interpreted as meaning that it does not preclude a court of a Member State from:

  • ordering a host provider to remove information which it stores, the content of which is identical to the content of information, which was previously declared to be unlawful, or to block access to that information, irrespective of who requested the storage of that information;
  • ordering a host provider to remove information which it stores, the content of which is equivalent to the content of information which was previously declared to be unlawful, or to block access to that information, provided that the monitoring of and search for the information concerned by such an injunction are limited to information conveying a message the content of which remains essentially unchanged compared with the content which gave rise to the finding of illegality and containing the elements specified in the injunction, and provided that the differences in the wording of that equivalent content, compared with the wording characterising the information which was previously declared to be illegal, are not such as to require the host provider to carry out an independent assessment of that content” (emphasis added).

Even prior to this ruling, however, NSD was already present at least to some degree in Germany, where courts had established, under the ‘Kern’ (core) doctrine, a duty of care for hosts to review all subsequent infringing acts of a similar nature that are easily recognizable. This has been used to require hosts, for instance, to (Husovec, 2018)2: (1) employ word-filtering technology for the name of the notified work, including on existing uploads; (2) use, as a supplementary tool, fingerprinting technology that goes beyond basic techniques such as MD5, which only detect identical files; (3) manually check external services such as Google, Facebook and Twitter for infringing links associated with the name of the notified work; or (4) use web-crawlers to detect other links on the provider’s own service.
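To illustrate why basic fingerprinting was considered insufficient, the following minimal Python sketch (not drawn from the sources cited here; all names and lists are hypothetical) screens an upload against MD5 hashes of previously notified files and against notified work titles. Because a cryptographic hash changes entirely when the file is altered in any way, even a slightly re-encoded or shortened copy escapes the hash check, which is why the case law described above demands supplementary measures such as word filtering and more robust fingerprinting.

```python
import hashlib

# Hypothetical blocklists built from previously notified content.
NOTIFIED_MD5_HASHES = {"9e107d9d372bb6826bd81d3542a419d6"}  # hashes of files taken down earlier
NOTIFIED_WORK_TITLES = {"example notified work"}            # work names cited in takedown notices


def md5_of_file(path: str) -> str:
    """Compute the MD5 digest of a file in streaming fashion."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def screen_upload(path: str, title: str) -> bool:
    """Return True if the upload should be blocked under this naive policy.

    Exact-hash matching only catches byte-identical re-uploads: any minor
    alteration (re-encoding, trimming a few seconds) yields a different MD5,
    so the word filter on the title acts as a crude supplementary check.
    """
    if md5_of_file(path) in NOTIFIED_MD5_HASHES:
        return True
    return any(work in title.lower() for work in NOTIFIED_WORK_TITLES)
```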

The term NSD originates from a heated discussion around the scope of the safe harbor for hosting intermediaries, which depends upon knowledge of the infringing activity. As discussed in the entries on red flag knowledge and willful blindness, knowledge is occasionally found by courts even outside the qualified notice process, in the presence of facts that are sufficient to impute a culpable intention to the intermediary. However, although the implications of these doctrines may be somewhat similar to NSD, the obligations are fundamentally different in nature: the one imposed by NSD arises automatically upon receipt of a valid notification, rather than following an inquiry into what it was reasonable to know in the circumstances. This also means that NSD requires platforms to filter all uploads in order to detect content previously identified as infringing (Kuczerawy, 2020)3, which will presumably be done in automated form given the sheer volume of uploads. Uploads on YouTube, for instance, amounted to more than 500 hours of video per minute as of May 2019, which illustrates why YouTube relies on automated content recognition technologies such as Content ID (deployed since 2007).
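Automated recognition of “equivalent” content typically means matching each upload against reference material with some tolerance for alterations, rather than requiring an exact match. The sketch below is only a generic illustration of that idea using word-shingle containment on text; it is not how Content ID or any other real system works (their matching techniques are proprietary), and all names and the threshold value are assumptions.

```python
from __future__ import annotations


def shingles(text: str, k: int = 3) -> set[tuple[str, ...]]:
    """Split text into overlapping k-word 'shingles' for fuzzy comparison."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(max(len(words) - k + 1, 1))}


def containment(upload: str, reference: str) -> float:
    """Fraction of the upload's shingles that also appear in the reference (0.0-1.0)."""
    su, sr = shingles(upload), shingles(reference)
    return len(su & sr) / len(su) if su else 0.0


def matches_notified_content(upload: str, references: list[str], threshold: float = 0.8) -> bool:
    """Flag an upload that is mostly contained in some previously notified reference.

    Unlike exact hashing, this can flag excerpts and lightly edited copies,
    but the threshold choice determines how much lawful, merely similar
    content (quotation, parody, licensed use) is swept in as well.
    """
    return any(containment(upload, ref) >= threshold for ref in references)
```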

In the view of the ECJ, automated search tools and technologies allow providers to achieve this result without undertaking an independent assessment, in particular to the extent that the injunction contains the name of the person concerned by the previously determined infringement, the circumstances in which that infringement was determined, and content equivalent to that which was declared to be illegal. However, this conclusion can be questioned: a sentence or word that is defamatory in one context is not necessarily so in a different context, which warrants human determination at some level. This carveout from the safe harbor is likely to be even more problematic if extended to infringements of copyright or trademark law, given the challenges involved in ensuring that a machine recognizes the existence of licenses or valid defences available to the alleged infringer.

In addition to the free speech concerns raised by the prior restraints imposed through NSD, there is substantial criticism of the economic effects of an NSD regime, in particular because it significantly raises costs for hosting platforms. While it may be convenient or even necessary for YouTube, Facebook, or other large platforms to use content recognition technologies, it can be problematic to impose these requirements on smaller players, who might in turn need to obtain a license from a bigger player in order to fulfil their obligation.

This was one of the major concerns with the proposal for a Directive on Copyright in the Digital Single Market, which required “information society service providers that store and provide to the public access to large amounts of works or other subject-matter uploaded by their users” to take measures, such as the use of effective content recognition technologies, to ensure the functioning of agreements concluded with rightsholders for the use of their works or other subject-matter, or to prevent the availability on their services of works or other subject-matter identified by rightsholders through cooperation with the service providers.

The final version of the Directive changed this by requiring: (a) best efforts to obtain licensing agreements; (b) best efforts, in accordance with high industry standards of professional diligence, to prevent the availability of the works in the sense explained above; and (c) the expeditious removal or disabling of content identified in notices, together with best efforts to prevent its future upload in accordance with point (b). Furthermore, it removed the specific reference to content recognition technologies, while specifying that such obligations apply only to the extent that rightsholders have provided the service providers with the relevant and necessary information (in the context of those technologies, this includes in primis the reference files needed to enable content recognition).

Even more importantly, the new version of the Directive provides guiding principles for the NSD regime, by:

  1. explicitly requiring Member States implementing such a regime to preserve copyright exceptions and not to impose general monitoring;
  2. creating a three-tiered regime in which full NSD is required only of big and established players: providers that have been offering their services in the EU for less than three years and that have an annual turnover below EUR 10 million need only comply with letter (a) above and act upon notice for the removal of specific content, while those among them whose average number of monthly unique visitors exceeds 5 million must also demonstrate best efforts to prevent further uploads of notified content in the sense explained under letter (c) (the sketch below illustrates how these thresholds interact); and
  3. specifying that the scope of the obligations imposed under this regime must take into account (a) the type, the audience and the size of the service, and the type of works or other subject matter uploaded by its users; and (b) the availability of suitable and effective means and their cost for service providers4.
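For illustration only, the following Python sketch encodes the thresholds of this tiered regime (three years of availability in the EU, EUR 10 million annual turnover, 5 million average monthly unique visitors) as a simple classification function. It is a plain reading of the tiers described above, not an authoritative restatement of the Directive; the field and obligation names are hypothetical.

```python
from dataclasses import dataclass


@dataclass
class Provider:
    years_in_eu: float            # how long the service has been available in the EU
    annual_turnover_eur: float    # annual turnover in euros
    monthly_unique_visitors: int  # average monthly unique visitors


def obligations(p: Provider) -> list[str]:
    """Return the obligations applicable under the tiered regime described above.

    Hypothetical labels: 'a' = best efforts to obtain licences,
    'b' = best efforts to prevent availability of identified works,
    'c' = expeditious takedown plus best efforts to prevent re-upload.
    """
    new_and_small = p.years_in_eu < 3 and p.annual_turnover_eur < 10_000_000
    if not new_and_small:
        return ["a", "b", "c"]                      # full regime for established players
    duties = ["a", "takedown upon notice"]          # lightest tier
    if p.monthly_unique_visitors > 5_000_000:
        duties.append("best efforts to prevent further uploads of notified content")
    return duties


# Example: a two-year-old service with EUR 2m turnover and 8m monthly visitors
print(obligations(Provider(years_in_eu=2, annual_turnover_eur=2_000_000,
                           monthly_unique_visitors=8_000_000)))
```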

References

  1. Eva Glawischnig-Piesczek v. Facebook, Case C-18/18 (CJEU 2019).
  2. Husovec, M. (2018). The Promises of Algorithmic Copyright Enforcement: Takedown or Staydown? Which Is Superior? And Why? Colum. JL & Arts, 42, 53.
  3. Kuczerawy, A. (2020). From ‘Notice and Take Down’ to ‘Notice and Stay Down’: Risks and Safeguards for Freedom of Expression. In: Frosio, G. (Ed.), The Oxford Handbook of Online Intermediary Liability. Oxford University Press.
  4. Angelopoulos, C. (2020). Harmonising Intermediary Copyright Liability in the EU: A Summary. In: Frosio, G. (Ed.), The Oxford Handbook of Online Intermediary Liability (pp. 315–334). Oxford University Press.

    Frosio, G. F. (2017). Reforming intermediary liability in the platform economy: A European digital single market strategy. Nw. UL Rev. Online, 112, 18.


By Nicolo Zingales

Nicolo Zingales is Professor of Information Law and Regulation at the law school of the Fundação Getulio Vargas in Rio de Janeiro, and coordinator of its E-commerce research group. He is also an affiliated researcher at the Stanford Center for Internet and Society, the Tilburg Law & Economics Center and the Tilburg Institute for Law and Technology, and co-founder and co-chair of the Internet Governance Forum’s Dynamic Coalition on Platform Responsibility.
