
European Commission Plan to Reform Europe’s Digital Space - Part 2 - Draft Digital Services Act

December 23, 2020

 

The second piece of the European Commission’s package of proposed legislation unveiled last week to reform online platforms is the Digital Services Act (DSA). The DSA is a major initiative to protect users of online platforms from illegal content and to require more transparency from the operators of such platforms. It would change the liability rules for online content, imposing content-moderation obligations and requirements to address and remove illegal content, and it would establish comprehensive rules for online advertising, including targeted advertising. It would affect a wide range of digital service providers, both big and small, including internet access providers, hosting providers, social media, online marketplaces, and online platforms, as well as their business users and customers.

The DSA is an effort to modernize and create an EU-wide uniform approach to digital services. It builds on the e-Commerce Directive, adopted 20 years ago, by establishing a common set of rules for online intermediaries. It seeks to foster innovation and growth and to make it easier for smaller platforms and providers to compete, while rebalancing the responsibilities of users, platforms, and public authorities by prioritizing the interests of EU citizens. For US-based companies, these rules would impose significant new notice, transparency, and takedown obligations relating to activity on their platforms.

 

The proposals will be subject to further discussion and agreement by the European Parliament and the Council of the EU. While final rules are likely at least a year away, companies doing business online in the EU should follow developments closely and assess what changes they may need to make once the rules become final. The DSA provides that EU Member States can impose steep fines for non-compliance of up to 6% of annual income or turnover, plus periodic penalty payments for continuing infringements of up to 5% of average daily turnover in the preceding financial year, per day of infringement.
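
To put these ceilings in context, the short Python sketch below computes the maximum fine and maximum cumulative daily penalty for a hypothetical provider. The turnover figure and the 365-day averaging convention are assumptions made for illustration; actual penalties would be set by Member State authorities within these caps.

    # Illustrative only: the turnover figure and 365-day averaging are
    # assumptions, not drawn from the DSA text.
    FINE_CAP_RATE = 0.06           # up to 6% of annual income or turnover
    DAILY_PENALTY_CAP_RATE = 0.05  # up to 5% of average daily turnover, per day

    def max_fine(annual_turnover_eur: float) -> float:
        """Ceiling on a fine for non-compliance."""
        return FINE_CAP_RATE * annual_turnover_eur

    def max_periodic_penalty(annual_turnover_eur: float, days: int) -> float:
        """Ceiling on cumulative daily penalties for a continuing infringement."""
        average_daily_turnover = annual_turnover_eur / 365
        return DAILY_PENALTY_CAP_RATE * average_daily_turnover * days

    # A hypothetical platform with EUR 2 billion in annual turnover:
    print(max_fine(2_000_000_000))                  # 120000000.0
    print(max_periodic_penalty(2_000_000_000, 30))  # ~8219178 over 30 days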

The key elements of the DSA include the following:

  • Updated Rules of Liability and Takedown Requirements. The DSA maintains the conditional immunity of online intermediaries (such as internet access providers, domain name registrars, and hosting services) that act as mere conduits for communications or that merely cache or host content provided by users, so long as certain requirements are met. However, hosting providers and online platforms must now establish easy-to-use notice-and-takedown mechanisms for illegal content.

    Online platforms must also take measures to prevent misuse of their platforms and takedown mechanisms. Subject to certain notice obligations and conditions, online platforms must suspend their services to users who frequently provide manifestly illegal content and, conversely, must suspend the processing of complaints from users who frequently submit complaints that are manifestly unfounded. (A minimal illustration of this suspension logic appears at the end of this section.)

    Online platforms will need to establish an internal complaint-handling procedure for their users that is easy to access and user-friendly. If a user does not agree with the platform’s decision or action on a complaint, the user has the right to refer the dispute to a certified out-of-court dispute settlement body. The DSA would establish a “Digital Services Coordinator” in each Member State, which, among other duties, would certify these dispute settlement bodies. The DSA exempts online platforms that qualify as micro or small enterprises from these requirements.

    The DSA also establishes the concept of “trusted flaggers”: independent entities, approved by the Digital Services Coordinators, that have particular expertise and competence in detecting, identifying, and notifying illegal content. Trusted flaggers can submit takedown notices even if they are not the subject of the illegal content, and platforms must process their notices with priority.

    These procedures are much more expansive and protective of users than what exists under US law, where the First Amendment provides significant protection for speech. First, the definition of “illegal content” subject to the DSA requirements is quite broad and includes “any information, which, in itself or by its reference to an activity, including the sale of products or provision of services, is not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law.” US takedown procedures and immunities are much narrower. For example, the principal statutory takedown procedure is that of the Digital Millennium Copyright Act (DMCA), which applies only to claimed copyright infringement. US law has no broad requirement to implement takedown mechanisms for all forms of illegal content.
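
    The DSA leaves “frequently,” the length of any suspension, and the assessment process to the platform’s terms and reasonable, case-by-case judgment. As a minimal illustration only, with invented thresholds and names, the suspension logic described above might be implemented along these lines:

        # Illustrative misuse policy. The thresholds and the warning step
        # reflect assumptions; the draft DSA does not prescribe numbers.
        from collections import Counter

        MANIFESTLY_ILLEGAL_LIMIT = 3   # hypothetical threshold
        UNFOUNDED_COMPLAINT_LIMIT = 5  # hypothetical threshold

        illegal_uploads = Counter()       # user_id -> confirmed illegal items
        unfounded_complaints = Counter()  # user_id -> manifestly unfounded ones

        def record_manifestly_illegal(user_id: str) -> str:
            """Action after a user's content is confirmed manifestly illegal."""
            illegal_uploads[user_id] += 1
            if illegal_uploads[user_id] == MANIFESTLY_ILLEGAL_LIMIT:
                return "warn"             # notify the user before suspending
            if illegal_uploads[user_id] > MANIFESTLY_ILLEGAL_LIMIT:
                return "suspend_service"  # temporary suspension of service
            return "none"

        def record_unfounded_complaint(user_id: str) -> str:
            """Action after a user's complaint is found manifestly unfounded."""
            unfounded_complaints[user_id] += 1
            if unfounded_complaints[user_id] > UNFOUNDED_COMPLAINT_LIMIT:
                return "suspend_complaint_processing"
            return "none"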

 

  • New Transparency Obligations for Removal of Content. If a hosting service provider removes or disables access to information provided by users, whether in response to a takedown request or on the provider’s own initiative, it must now provide a “statement of reasons” to the person who uploaded the content. In addition, providers of intermediary services must publish an annual detailed report of the orders received from authorities, the takedown notices received, the content moderation undertaken, and the number of complaints received.

 

  • Mechanisms for Responding to Orders. The DSA imposes obligations on providers of intermediary services to act upon orders of national judicial or administrative authorities, including orders to remove illegal content and provide information about users of the services.

 

  • New Disclosure Requirements for Terms and Conditions. Providers of intermediary services must include the following information in their terms and conditions: any restrictions they impose on the use of their service with respect to information provided by users, and a description of the policies, procedures, measures, and tools used for content moderation, including algorithmic decision-making and human review. These disclosures must be set out in clear and unambiguous language and be publicly available in an easily accessible format.

 

  • New Online Advertising Obligations. The DSA imposes greater transparency requirements on online advertising. Online platforms must disclose, clearly, unambiguously, and in real time, that the information displayed is an advertisement, the identity of the person on whose behalf the advertisement is displayed, and meaningful information about how the user was targeted. These requirements could significantly affect the form, content, and delivery of online advertising.
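
    The draft prescribes these disclosures but not any particular format. As a rough sketch under that caveat, with all field names invented, the per-ad metadata a platform surfaces to users might look like this:

        # Hypothetical per-impression ad disclosure; field names are invented.
        from dataclasses import dataclass

        @dataclass
        class AdDisclosure:
            is_advertisement: bool       # flagged clearly and in real time
            displayed_on_behalf_of: str  # identity of the advertiser
            targeting_explanation: str   # meaningful targeting information

        disclosure = AdDisclosure(
            is_advertisement=True,
            displayed_on_behalf_of="Example Advertiser GmbH",
            targeting_explanation="Shown based on your approximate location "
                                  "and stated interest categories.",
        )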

 

  • New Disclosures for Online Sellers. The DSA would also require online platforms to obtain certain information about online traders, including contact information, identification documents, bank account information, and a self-certification that the trader will offer only products that comply with EU law. Certain of that information may be disclosed to users of the services. In addition, online platforms must design their interfaces so that pre-contractual and product-safety information is provided to consumers before purchase. These provisions are designed to make it easier to track down and stop sellers of illegal goods.
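
    As an illustration of the categories of information named above (the structure and field names are invented; the draft does not prescribe a format), a marketplace’s trader-onboarding record might look like the following:

        # Hypothetical trader-onboarding record; structure is illustrative.
        from dataclasses import dataclass

        @dataclass
        class TraderRecord:
            name: str
            address: str
            email: str
            phone: str
            id_document_ref: str   # e.g., reference to an identity document
            bank_account: str
            self_certifies_eu_compliance: bool  # offers only EU-compliant goods

            def is_complete(self) -> bool:
                """Check that every required category has been collected."""
                return all([self.name, self.address, self.email, self.phone,
                            self.id_document_ref, self.bank_account,
                            self.self_certifies_eu_compliance])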

 

  • Designation of Legal Representative. A provider of intermediary services that is not established in the EU must designate a legal representative in the EU to cooperate with supervisory authorities regarding compliance with the DSA. This is similar to the requirement under the EU’s General Data Protection Regulation (GDPR) to designate a legal representative. Significantly, the legal representative can be held liable for non-compliance with the DSA.

 

  • Large Platform Requirements. The DSA also imposes additional requirements specifically designed for very large online platforms, defined as those with at least 45 million average monthly active users in the EU (roughly 10% of the EU population):
    • Conduct an annual assessment of systemic risks resulting from use of their platforms, including the dissemination of illegal content, negative effects on fundamental rights (e.g., privacy), and intentional manipulation of the service that could have foreseeable negative effects on public health, minors, civic discourse, electoral processes, or public security. The platforms are further required to implement reasonable, proportionate, and effective mitigation measures to address these risks.
    • Be subject to independent audits to evaluate their compliance with their obligations under the DSA.
    • Create a public repository, available through APIs, containing detailed information (but no user personal data) on the online advertisements served on the platform in the prior year (see the illustrative sketch after this list).
    • Provide the Digital Services Coordinators with access to any data necessary to monitor and assess compliance with the DSA.
    • Appoint compliance officers responsible for monitoring compliance with the DSA. This is similar to the position of Data Protection Officer for compliance with the GDPR.
    • Publish transparency reports every six months, including information on risk assessments, risk mitigation, audit reports, and audit implementation.
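
    The draft requires the ad repository and its contents but specifies no schema or API. As a rough sketch under that caveat, with the record layout and function names invented, a repository entry and the kind of lookup an API endpoint might expose could look like this:

        # Hypothetical ad-repository record and lookup; names are invented.
        from dataclasses import dataclass
        from datetime import date

        @dataclass
        class AdRepositoryRecord:
            ad_content: str          # the advertisement itself
            on_behalf_of: str        # advertiser identity
            period_start: date       # first day the ad was displayed
            period_end: date         # last day the ad was displayed
            targeting_parameters: dict  # main parameters used to target
            recipients_reached: int  # aggregate only -- no personal data

        _REPOSITORY: list = []       # populated by the ad-serving system

        def query_ads_by_advertiser(advertiser: str) -> list:
            """A lookup a public repository API might expose."""
            return [r for r in _REPOSITORY if r.on_behalf_of == advertiser]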

 

  • Standards/Compliance and Enforcement. The DSA also includes provisions for the Commission to develop and facilitate standards, codes of conduct, and crisis protocols for extraordinary circumstances involving public security or public health. Each Member State would designate a Digital Services Coordinator responsible for the application and enforcement of the DSA, with specific provisions outlining the Coordinators’ responsibilities and providing for cross-border cooperation. The Member State where an intermediary is established or, if not established in the EU, where its legal representative is designated, has jurisdiction to enforce the DSA. As noted above, the DSA provides that Member States can impose steep fines for violations.

This memorandum is a summary for general information and discussion only and may be considered an advertisement for certain purposes. It is not a full analysis of the matters presented, may not be relied upon as legal advice, and does not purport to represent the views of our clients or the Firm. Riccardo Celli, an O’Melveny Partner licensed to practice law in the Capital Region of Brussels, the Law Society England & Wales, and Roma, Christian Peeters, an O’Melveny Of Counsel licensed to practice law in the Capital Region of Brussels and Rechtsanwalt, Germany, Scott Pink, an O’Melveny Special Counsel licensed to practice law in California, Sergei Zaslavsky, an O’Melveny Counsel licensed to practice law in the District of Columbia and Maryland, and Rebecca Evans, an O’Melveny associate licensed to practice law in the Law Society England & Wales contributed to the content of this newsletter. The views expressed in this newsletter are the views of the authors except as otherwise noted.

© 2020 O’Melveny & Myers LLP. All Rights Reserved. Portions of this communication may contain attorney advertising. Prior results do not guarantee a similar outcome. Please direct all inquiries regarding New York’s Rules of Professional Conduct to O’Melveny & Myers LLP, Times Square Tower, 7 Times Square, New York, NY, 10036, T: +1 212 326 2000.