The Digital Services Act: Obligations and Impacts on Online Service Providers

Introduction

The European Union’s Digital Services Act (DSA), together with its twin the Digital Markets Act (DMA), is by far the most ambitious piece of legislation the EU has issued with the clear intention of reining in the American Big Tech companies such as Google, Facebook, and Twitter. The Act will almost certainly have a great impact on the global Internet ecosystem as it is implemented, first partially and then fully, over the coming months up to February 2024.

The DSA sets rules defining providers of intermediary services, illegal content, goods, and behavior online, as well as rules on service providers’ liability. It also imposes various obligations based on the type and size of service providers. These will create new realities in the digital services market, both within Europe and globally. Direct impacts are expected on users’ experience everywhere in the world. The new Act also sets precedents that cannot be ignored by any country seeking to regulate digital services within its territory.

This paper seeks to provide as comprehensive an introduction as possible to the Digital Services Act: its road through a process that took three and a half years until it officially entered into force on November 16, 2022; its objectives; its most important rules; the obligations it sets for service providers; the new mechanisms and bodies created for its implementation; and the new realities it is expected to create, including their impacts on people living outside the EU.

The main source of information for this paper is the text of the Act as published in the EU Official Journal on October 27, 2022. Relevant Recitals and Articles of the Act are referenced by their numbers in parentheses where needed.

Timeline

The Digital Services Act was almost three and a half years in the making, from the first announcement of the idea in May 2019 until its publication in the EU Official Journal in October 2022.

The intention to propose a Digital Services Act was first mentioned in the document entitled “A Union that strives for more: My agenda for Europe,” issued by Ursula von der Leyen as a candidate for the European Commission presidency in 2019. In that document the Act was part of a wider program entitled “A Europe Fit for the Digital Age.” A paper outlining the main features of the proposal was distributed internally in June 2019, apparently by the Directorate-General for commerce, and was leaked to the public in July of the same year.

In April 2020, three European Parliament committees (the Committee on Internal Market and Consumer Protection, the Committee on Legal Affairs, and the Committee on Civil Liberties, Justice and Home Affairs) issued reports with recommendations for the Commission concerning the Digital Services Act, which was expected to be proposed before the end of the year.

Public consultations were supposed to open in March 2020 but were delayed because of the Covid crisis. They eventually opened on June 2, 2020 and closed on September 8, 2020. The Commission defined two broad questions the new regulation was meant to address: first, updating the e-Commerce Directive of 2000, with particular focus on liability exemptions and the country-of-origin principle; and second, ensuring a fair, open, and contestable market in the presence of large platforms. Within these two questions, the Commission defined six areas for the consultation to examine: online safety, liability, market dominance, online advertising and smart contracts, self-employment issues online, and the potential future governance framework for online services.

The European Commission’s proposal for the Digital Services Act was submitted to the European Parliament and the Council on December 15, 2020.

On November 25, 2021, the Council announced its agreed position on the proposal for the DSA. The Council’s statement was titled “What is illegal offline should be illegal online.” The most important amendments by the Council were: explicit mention of online search engines, enhanced protection of minors online, additional obligations for online marketplaces and search engines, stricter rules for very large online platforms (VLOPs), and new exclusive enforcement powers for the European Commission to deal with infringements by VLOPs and very large online search engines (VLOSEs).

A full year after the proposal was submitted, the European Parliament’s Committee on Internal Market and Consumer Protection (IMCO) adopted its DSA report, which included many amendments to the original draft.

Following a rather informal mechanism of the European Union’s presiding institutions, a trilogue (three-way negotiations) among the European Commission, the European Parliament, and the Council was conducted to reach a “political” agreement on the proposed Act. Such an agreement was reached on April 23, 2022. The European Parliament finally adopted the proposal on July 5, 2022.

The Digital Services Act was published in the European Union Official Journal on October 27, 2022. According to the Regulation itself, it entered into force 20 days later, on November 16, 2022. Its application, however, is phased: the latest date, by which it becomes applicable to all entities subject to it, is February 17, 2024.

Motivations and Objectives of the DSA

The Digital Services Act was first of all motivated by the need to curb the political, economic, and social impacts of information disseminated through intermediary digital services, in particular online platforms, and more specifically very large online platforms and search engines.

Economic losses due to the online sale of counterfeit and copyright-infringing goods were also an important motivation for the Act. Another economic motivation is that the sheer size and influence of the Big Tech companies that provide very large online platforms have turned them into gatekeepers controlling entry into the intermediary services market and rendering competition in it almost non-existent; European companies in particular stand little chance of competing with American giants like Facebook, Google, and Amazon. Finally, feared societal disruptions due to the dissemination of certain types of information were also a main motivation for the DSA.

These motivations are clearly reflected in the Act’s proposal and its final text. Unlike other similar regulations, the DSA adopts a type-and-size approach to imposing obligations on intermediary services: different obligations are set for different types of intermediaries, more obligations are imposed on online platforms, and even more on those designated as very large online platforms or very large online search engines (VLOPs and VLOSEs).

The objectives of the DSA as stated by official documents of the European Commission and Article (1) of the Regulation include:

Providing better protection to users and their fundamental rights online, both as recipients of intermediary services and consumers of services mediated by them.

Setting clear obligations for digital service providers to ensure transparency and accountability.

Harmonizing applicable rules across the member states and preventing the fragmentation of the unified internal market due to different laws dealing with the same subject matters in different member states.

Structure of the DSA

The DSA deals with one core issue, the dissemination of illegal information/content, which it addresses through three main areas: exemption from liability, obligations, and implementation. Besides the very detailed explanatory notes/recitals of the preamble (there are exactly 156 of them), the Regulation itself is divided into 5 Chapters. The first and last deal with general matters, though Article 3 of the first chapter, which defines the terms used by the Regulation, is of great importance. Chapters 2-4 each deal with one of the above-mentioned main areas.

The rationale of the Regulation goes like this: illegal information/content of different types poses various risks (Recital 1). To minimize these risks, the illegal information/content the public is exposed to through the Internet must be kept to a minimum. Intermediary digital services are responsible for allowing access to, making available, and/or disseminating all types of information/content online, including content deemed illegal. To achieve the objective of minimizing the illegal information/content disseminated online, providers of intermediary services should behave responsibly and diligently (Recital 3).

Additionally, for the main objective of the Regulation to be achieved effectively, its rules must apply to all intermediary services whose recipients live within the borders of the EU member states, whether they are citizens of those states or foreign residents. Given the cross-border nature of the Internet, providers of intermediary services can be established or located anywhere in the world. Limiting the application of the Regulation’s rules to providers established or located within the EU would deprive those rules of the minimum effectiveness required to achieve their objectives. Thus, the Regulation’s rules apply to providers of intermediary services regardless of their place of establishment or location, provided they have a “substantial connection to the Union.” (Recital 7)

Rules of Liability

For a long time, the rule has been that providers of intermediary services of all types are exempt from liability for the information they transmit, store, or disseminate on behalf of recipients of their services. European law, however, has always made this exemption conditional on ignorance of the content concerned. While this rule hasn’t changed with the Digital Services Act, it has been greatly nuanced: considerable effort was made to specify exactly when, why, and how a provider of an intermediary service may preserve its immunity from legal liability, and the rules are detailed according to the type of service provided.

A provider of an intermediary service enjoys immunity from legal liability as long as it does not interfere with the information it handles, for any purpose, in a way that makes the content of that information knowable to it. Once the provider has obtained knowledge of the content, by any legal means, it can preserve its exemption from liability only by taking prompt action, whether removing or disabling access to the information concerned. Additionally, the Regulation indirectly sets compliance with its obligations as a condition for providers of intermediary services to keep their exemption from liability.

What the DSA applies to

The DSA has quite a comprehensive and hence expansive scope. First, it makes sure that its rules are generally applicable to all types of intermediary services, defined by Article (3) to include:

Mere Conduit: any service that only transmits information provided by a recipient of the service or provides access to a communication network. An example of such services is Internet Service Providers, which simply provide their clients with access to the communication network we call the Internet.

Caching: any service that performs automatic, intermediate, and temporary storage of information before transmitting it to its destination. The purpose of storing the information is to make its onward transmission more efficient.

Hosting: any service that stores information provided by, and at the request of, a recipient of the service.

It is clear, even at this level of abstraction, that the different types of intermediary services are involved with the information they handle to different degrees. The DSA therefore adopts a differentiated approach to each type, with obligations ranging from general ones imposed on all intermediary services to specific ones imposed only on those hosting services designated as very large online platforms (VLOPs) and very large online search engines (VLOSEs).

Extra-Jurisdictional Applicability

As mentioned before, to be effective in any meaningful way the Digital Services Act had to be applicable to providers of intermediary services not established or located within the borders of the European Union. In legal jargon this makes it an extra-jurisdictional legislative act, which is generally discouraged under international law. This is not the first time the EU institutions have resorted to such controversial legislative tactics; this time, however, extra care was taken in setting the rules of applicability of the law’s provisions to foreign providers of intermediary services.

Recital (7) of the explanatory notes first explains why the Act’s rules apply to providers of intermediary services “irrespective of their place of establishment or their location.” It then sets the condition that such providers “offer their services in the Union,” which can be demonstrated by “a substantial connection to the Union.”

Next, Recital (8) details what constitutes “a substantial connection to the Union.” Such a connection exists where the number of recipients of the service in one or more Member States is significant, or where activities are targeted towards one or more Member States. Targeting can be evidenced by “the use of a language or a currency generally used in that Member State,” the possibility of ordering products or services in one or more Member States, “the use of a relevant top-level domain” (i.e., one assigned to a Member State, such as .fr or .de for France and Germany respectively, or the Union’s own top-level domain .eu), the availability of a relevant application in the national application store, local advertising of the service or advertising in a language used in a Member State, or conducting customer relations “such as by providing customer service in a language generally used in that Member State.”
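To make these indicators more concrete, the following is a minimal Python sketch of how a provider might screen its own signals for a possible “substantial connection.” It is only an assumption-laden illustration: the ServiceSignals structure, its field names, and the simple any() test are hypothetical, and the actual legal assessment is qualitative and made case by case by regulators, not by a boolean check.

```python
from dataclasses import dataclass

# Hypothetical signals mirroring the indicators listed in Recital (8);
# the field names and structure are illustrative, not part of the Regulation.
@dataclass
class ServiceSignals:
    significant_eu_recipients: bool          # significant number of recipients in one or more Member States
    member_state_language_or_currency: bool  # use of a language or currency generally used in a Member State
    orders_possible_in_member_state: bool    # possibility of ordering products or services there
    relevant_top_level_domain: bool          # e.g. a .fr, .de, or .eu domain
    app_in_national_app_store: bool          # relevant application available in a national app store
    local_advertising: bool                  # local advertising or advertising in a Member State language
    local_customer_service: bool             # customer relations in a language generally used in a Member State

def substantial_connection_indicated(signals: ServiceSignals) -> bool:
    """Rough heuristic: any one of the Recital (8) indicators suggests that a
    'substantial connection to the Union' may exist and warrants closer review."""
    return any(vars(signals).values())
```

A real assessment would weigh these signals together rather than treat any single one as decisive.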

Illegal Content

The Digital Services Act does not specify, exhaustively or otherwise, what constitutes illegal content. It rather delegates that specification to other Union laws and to the domestic laws of the member states, on condition that they comply with Union law. Accordingly, Article (3), para (h) of the Regulation defines illegal content to mean:

[…] any information that, in itself or in relation to an activity, including the sale of products or the provision of services, is not in compliance with Union law or the law of any Member State which is in compliance with Union law, irrespective of the precise subject matter or nature of that law.

Recital (12) of the explanatory notes goes into detail defining the “concept” of illegal content. It is mainly meant as guidance on when a domestic provision designating some content as illegal counts as compliant with Union law.

Obligations Imposed by the DSA

Providers of intermediary services

The obligations imposed on all providers of intermediary services, irrespective of their type or size, include designating a single point of contact for direct communication (by electronic means) with Member States’ authorities, the Commission, and the newly established Board (Article 11). Similarly, a single point of contact should be designated for direct communication with recipients of the service, on the condition that the recipient can choose the means of communication, which cannot be limited to automated tools (Article 12).

Each provider of intermediary services is also required to assign a legal representative in at least one of the Member States (Article 13). One reason this is required is that competent authorities need a point of application for actions against the provider in case of infringements of the law.

On the transparency side, Article (14) obliges providers of intermediary services to make their terms and conditions public in a clear and user-friendly manner. Article (15) obliges them to publish periodic transparency reports and specifies in exhaustive detail what these reports must include for each type of intermediary service.

Hosting Services

Articles (16, 17 and 18) set obligations that are applicable to the providers of hosting services. These include:

  • Establishing a Notice and Action Mechanism, whereby service providers should “allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content.” Such notifications are significant because they prevent the provider from benefiting from the general legal exemption from liability based on ignorance.
  • Statement of Reasons, whereby a service provider should provide “a clear and specific statement of reasons” to any recipient of the service whose content was considered illegal and thus incurred some restriction, the suspension or termination of the service, or the suspension or termination of the recipient’s account.
  • Notification of Suspicions of Criminal Offences, whereby a hosting service provider is obliged to inform the law enforcement or judicial authorities of the concerned member states once it becomes aware of information giving rise to a suspicion of a criminal offence involving a threat to the life or safety of a person.

Online Platforms

As per the Regulation, online platforms are hosting services that disseminate information to the public besides storing it at the request of the service recipient (Article 3, i). Dissemination to the public means making the information available to a potentially unlimited number of recipients (Article 3, k). Online platforms are subject to an extra layer of obligations specified by the Articles of Sections 3 and 4 of Chapter 3 of the Regulation. Section 3 applies to all providers of online platforms. Section 4 contains additional provisions applicable to providers of online platforms that allow consumers to conclude distance contracts with traders.

The obligations specific to all providers of online platforms services include:

  • Complaint Handling: Providers of online platform services are obliged by the Regulation to implement an internal complaint-handling system for recipients of the service, including individuals or entities that have previously submitted notices flagging content as illegal. The system should allow decisions made by the provider to be challenged for at least six months after a decision was made. The provider should inform complainants of its decision without undue delay, and should also inform them of the possibility of resorting to out-of-court dispute settlement.
  • Out-of-Court Dispute Settlement: a non-binding mechanism that allows third-party entities, certified by the Digital Services Coordinator of the member state where they are established, to provide out-of-court dispute settlement services at the request of complainants challenging the decision of a provider of online platform services. The provider is not obliged to accept the resort to out-of-court settlement or to implement its outcome, and the body providing the settlement does not have the power to impose a binding settlement on either party to the dispute.
  • Trusted Flaggers: entities (possibly including official authorities of a concerned member state) certified by the relevant Digital Services Coordinator on the basis of their particular expertise in detecting illegal content. Providers of online platform services are obliged to give priority to notices submitted by these trusted flaggers deeming some content illegal.
  • Measures Related to Advertising: providers of online platform services that present advertisements on behalf of some of their service recipients are required, first, to make it easy for other service recipients to recognize that the information presented is an advertisement. They are also required to provide the recipients of their services with the identity of the advertiser, the identity of the individual or entity paying for the advertisement if different from the advertiser, and the parameters used to determine the recipient targeted by the advertisement.
  • Recommender System Transparency: this includes making public the strategy these systems use to target specific recipients of the service. Importantly, there is an obligation to allow users to opt out of being targeted on specific bases, along with a prohibition on targeting individuals based on identity characteristics and a general prohibition on targeting minors based on any personal information.
  • Other obligations specific to providers of online platform services include measures and protections against misuse of the service, notice and action obligations, extra transparency reporting obligations, and the online protection of minors.

For providers of online platform services that facilitate the conclusion of distance contracts with traders, the following extra obligations apply:

  • Traceability: Providers of online platform services are obliged to acquire identification information about any trader with whom they allow consumers to conclude distance contracts, including name, address, telephone number and email address; an identification document; payment account details; and the trade register and registration number, or equivalent, if applicable (a minimal illustrative sketch of such a record follows this list). Providers are required to check the validity of the information provided by the trader. For traders already using their services on February 17, 2024, providers should obtain this information within 12 months of that date. If a trader fails to deliver this information, the provider should suspend the provision of its services to that trader.
  • Right to Information: If a provider of online platform services becomes aware that an illegal product or service was offered by a trader to consumers (located in the EU), the provider should inform those consumers of this fact, of the identity of the trader, and of any relevant means of redress. If the provider does not have the contact details of all consumers concerned, it should make this information publicly available.
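As a rough illustration of the kind of record a provider might keep to satisfy the traceability obligation, here is a minimal Python sketch. The TraderRecord class, its field names, and the completeness check are hypothetical and merely mirror the categories of information listed above; the Regulation does not prescribe any particular format, and checking that the fields are filled in is not the same as verifying their validity.

```python
from dataclasses import dataclass
from typing import List, Optional

# Hypothetical record mirroring the information categories listed above;
# not a format prescribed by the DSA.
@dataclass
class TraderRecord:
    name: str
    address: str
    telephone: str
    email: str
    identification_document: str      # reference to the trader's identification document
    payment_account_details: str
    trade_register_number: Optional[str] = None  # "or equivalent, if applicable"

def missing_mandatory_fields(record: TraderRecord) -> List[str]:
    """Return the names of mandatory fields that are still empty.
    An empty result means the information has been collected; verifying
    its validity is a separate obligation on the provider."""
    mandatory = ("name", "address", "telephone", "email",
                 "identification_document", "payment_account_details")
    return [field for field in mandatory if not getattr(record, field).strip()]
```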

VLOPs and VLOSEs

VLOPs and VLOSEs are categories created by the DSA based on the number of services’ monthly active recipients residing in the EU. This number is specified by Article (33) to be equal to or higher than 45 million. An online platform or a search engine is designated as a VLOP or a VLOSE by a decision made by the Commission, of which it should inform the concerned provider, the Board and the Digital Services Coordinator of establishment (i.e., of the member state where the provider is established if any). The list of providers of VLOPs and VLOSEs is published in the Official Journal of the European Union and updated on need.
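The numerical part of the designation criterion is simple to express; the sketch below assumes a hypothetical helper name and treats the 45 million figure from Article (33) as a constant. Crossing the threshold does not itself designate a service: designation remains a formal decision of the Commission.

```python
# Threshold from Article (33): average monthly active recipients in the Union.
VLOP_VLOSE_THRESHOLD = 45_000_000

def meets_designation_threshold(avg_monthly_active_eu_recipients: int) -> bool:
    """Hypothetical helper: True if a platform or search engine reaches the
    numerical threshold at which the Commission may designate it a VLOP or VLOSE."""
    return avg_monthly_active_eu_recipients >= VLOP_VLOSE_THRESHOLD
```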

The additional obligations set by the Regulation for VLOPs and VLOSEs become applicable to each of them four months after it is notified of the Commission’s decision designating it as such. These obligations include:

  • Risk Assessment: Providers are required to conduct a periodic assessment of any systemic risks stemming from the design or functioning of their services and related systems. The Regulation specifies particular factors whose influence on these risks must be assessed: the design of the service’s recommender systems and any other relevant algorithmic systems; content moderation systems; applicable terms and conditions and their enforcement; the selection and presentation of advertisements; and data-related practices. Providers must first produce such an assessment by the date the obligations start applying to them (four months after designation), and then annually thereafter. They must preserve the supporting documents used in their assessments for at least three years after carrying them out and communicate them, on request, to the Commission and to the Digital Services Coordinator of establishment. (Article 34)
  • Mitigation of Risks: Based on the risk assessments conducted under Article (34), providers of VLOPs and VLOSEs are required to put in place measures mitigating the risks identified. Such measures may include changes to various elements of the services: their design and features, terms and conditions, content moderation processes, algorithmic systems (especially recommender systems), advertising systems, cooperation with trusted flaggers, implementation of decisions of out-of-court dispute settlement bodies, and ensuring that manipulated or fake media items are distinguishable from the original through markings.
  • Independent Audit: Providers of VLOPs and VLOSEs are required to undergo, at least once a year and at their own expense, audits performed by independent organizations. These audits cover compliance with the obligations set by the Regulation and with commitments undertaken under codes of conduct and crisis protocols. The organizations performing the audits must be completely independent from the provider of the service and satisfy specific conditions ensuring their impartiality. If an audit report is not positive, the provider should adopt, within one month, an audit implementation report setting out the measures it will take in response to the operational recommendations of the audit.
  • Recommender Systems: In addition to the obligations related to recommender systems that apply to all providers of online platform services, providers of VLOPs and VLOSEs are required to provide, for each of their recommender systems, at least one option that is not based on profiling of the targeted recipients.

Competent authorities and supervising bodies

Implementation and enforcement of the DSA’s provisions in each of the Union’s member states are delegated to the competent authorities of those states. The Regulation obliges each member state to designate one or more competent authorities responsible for the supervision of providers of intermediary services and the enforcement of the Regulation, and to designate one of them as its Digital Services Coordinator by February 17, 2024. Besides the Digital Services Coordinators, the Regulation gives the European Commission new powers to supervise and enforce the provisions related to VLOPs and VLOSEs included in Section 5 of Chapter 3 (Article 56). The Regulation also establishes a new body, the European Board for Digital Services.

Digital Services Coordinators

Each member state of the EU will designate an existing or newly established competent authority as its Digital Services Coordinator. Member states are obliged to ensure that their Digital Services Coordinators have the technical, financial, and human resources necessary to supervise the providers of intermediary services falling within their competence. Digital Services Coordinators should have complete autonomy, including in managing their own budgets, to ensure that they act independently and free from any external influence.

To carry out their responsibilities, Digital Services Coordinators are given powers of investigation, which they can exercise on their own or by requesting a judicial authority to order the related actions. Such actions may include requiring the provider, or other individuals or entities, to submit information they hold related to a suspected infringement, and the inspection of premises, including seizing or taking copies of information.

Digital Services Coordinators are also given enforcement powers, including accepting commitments offered by providers concerning their compliance with the Regulation, ordering the cessation of infringements, imposing remedies, fines, and periodic penalty payments, and requesting a judicial authority to order the temporary restriction of recipients’ access to a service whose provider fails to comply.

European Board for Digital Services

The Regulation establishes an independent advisory group named the European Board for Digital Services, referred to throughout the Regulation as the “Board.” The Board advises the Digital Services Coordinators and the Commission on the consistent application of the Regulation, coordinates and contributes to the guidelines and analyses of the Commission and the Digital Services Coordinators on emerging issues, and assists them in the supervision of VLOPs.

The Board is composed of the Digital Services Coordinators, each represented by a high-level official. The Commission chairs the Board, convenes its meetings, and prepares their agenda. Each member of the Board has one vote, except the Commission, which has no vote. Decisions are made by simple majority.

Expected Impacts on Human Rights and Internet Governance

As stated in the Regulation’s own provisions, the protection of the fundamental rights of European citizens is one of its major objectives. Reading the Regulation’s text, it can be seen that it does enhance the protection of some rights, especially consumers’ rights: the law guarantees much more transparency about goods and services purchased through online platforms, and about the traders offering them, and it provides a level of assurance of the legality of goods and services offered for sale online. As concerns the right to privacy, the law takes a few steps forward compared to the current situation. It prohibits the use of some categories of personal data by algorithmic recommender systems, and it prohibits the use of minors’ personal data altogether. But the DSA has not stopped the use of profiling for recommender systems entirely, on the basis of balancing the right to privacy on one side against the right to make profits on the other.

The right to freedom of expression has the most ambivalent status in the Regulation, although the Regulation vows to protect it. In one very important respect, the law makes recipients of intermediary services subject to the laws of all the Union’s member states when it comes to determining which content is illegal. This expansion means that a service recipient might be exposed to damages through the application of a law whose applicability to her is not constitutionally valid, since it is not in force in her country of residence. More dangerously, the enforcement of these laws becomes the job of providers of intermediary services, with the involvement of other parties such as trusted flaggers, out-of-court dispute settlement bodies, and any individual or entity flagging content as illegal. In all cases the decision is made by the provider of the service, which becomes investigator, judge, and enforcer, deciding to remove or restrict access to content, or to suspend or remove the account of the service recipient. While means of redress are provided, the decision itself remains of questionable constitutionality.

It becomes even worse when we consider that a service provider may take actions against a recipient of the service residing outside the Union. It is true that the law does not oblige the provider to apply its compliance decisions globally, which means that the provider can limit such actions to within the Union’s borders. Providers may nevertheless choose to apply their decisions concerning content deemed illegal under EU and member state laws globally. Recipients of the service would accordingly be subject to laws to which they have no connection whatsoever. They can potentially incur damages, but in their case no means of redress would be available, as the law’s provisions concerning such means are not in force in their countries.

When it comes to Internet Governance, the DSA is one of the most important breakthroughs achieved by state governments and similar institutions against the almost full autonomy private sector actors have enjoyed in content moderation, today one of the most important fields of Internet Governance. The law also sets precedents that cannot be ignored in this field. This means that the EU is, in practice, single-handedly taking on the setting of governance rules for one of the Internet’s major functions, which will have great impacts on its users and on companies operating everywhere in the world. This particularly damages the multistakeholder model that still prevails in Internet Governance.