The European Commission has just published its proposal to regulate digital services in two texts which even it considers ambitious: the Proposal for a Regulation on Digital Markets (analyzed here) and the Proposal for a Regulation on Digital Services (DSA), which we address below. In forthcoming articles, we will take a close look at the many changes that lie ahead. Today, by way of introduction, we provide a summary of the main obligations (and rights) contained in the Proposal for a Regulation on Digital Services.
As the Commission rightly states, today's society bears little resemblance to the emerging digital market of the turn of the millennium, which was regulated by the e-Commerce Directive (implemented in Spain through the Information Society Services and Electronic Commerce Law). To put this in perspective, remember that in 2000, when that directive was published, companies like YouTube, Facebook, Instagram, LinkedIn, Spotify and TikTok did not even exist.
Firstly, the Commission seeks to respect the principles underlying the e-Commerce Directive, which undoubtedly were of great help in encouraging the emergence of all the digital services we have today, and which we can’t live without. Secondly, in line with its Communication Shaping Europe’s Digital Future, the European Commission continues its commitment to update the horizontal rules that define the responsibilities and obligations of providers of digital services and online platforms in particular. It intends to do this while also strengthening the protection of European citizens’ fundamental rights and boosting the creation of fairer and more open digital markets. This is not an easy task.
The aim is for the DSA to help remove illegal products, services and content from the Internet (including hate speech, terrorism, fake news, the sale of dangerous products, counterfeits, child pornography, etc.). The Commission considers that the regulatory framework established in the e-Commerce Directive has not been sufficient to combat these types of content and that, as a result, Member States began regulating independently, creating a patchwork of rules that could fragment the single market and offer citizens unequal protection.
In this scenario, following the consultation process that started at the beginning of the year and, also, bearing in mind the proposals of the European Parliament, the Commission considers it necessary to introduce a harmonized regulatory framework that hinges on four points. Let us take a look.
Liability of digital service providers
Until now, the liability of digital service providers has rested on two key principles. First, providers were not liable for illegal content they hosted or transmitted as long as they had no actual knowledge of it. Second, they were not subject to any general monitoring obligation to prevent the publication or transmission of illegal content.
Although the DSA maintains both principles, it introduces new obligations for providers of hosting services and online platforms. These include: (i) creating specific processes to request the removal of illegal content, including a statement of reasons for the removal; (ii) implementing mechanisms that enable users to defend themselves if they consider that their content has been removed unfairly, for example, in breach of their rights to freedom of expression and information; and (iii) the obligation to cooperate with the authorities of the Member States both in the removal of illegal content and in identifying certain users.
Due diligence obligations
The Commission aims to increase transparency with respect to how digital service providers operate. The objective is not to interfere with their freedom to conduct a business, but rather to increase their transparency to users, both consumers and professional users.
The measures to be introduced include the following: (i) establishing a single point of contact; (ii) designating a legal representative if they do not have an establishment in the Union but offer services in the Union; (iii) providing a description of the policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making; (iv) publishing information on requests for removal of illegal content received from third parties (such as public authorities or citizens) or as a result of their voluntary monitoring activities.
Additional obligations for online platforms
The third set of obligations is directed specifically at online platforms, which are defined as “a provider of a hosting service which, at the request of a recipient of the service, stores and disseminates to the public information, unless that activity is a minor and purely ancillary feature of another service and, for objective and technical reasons cannot be used without that other service, and the integration of the feature into the other service is not a means to circumvent the applicability of this Regulation”.
The specific obligations that are only applicable to online platforms include: (i) creation of internal complaint-handling systems to manage the removal of illegal content and/or the suspension or termination of the services and/or of users’ accounts; (ii) cooperation with alternative dispute-resolution services (such as mediation mechanisms); (iii) preferential response to removal notices by trusted flaggers; (iv) temporary suspension of accounts of repeat offenders; (v) securing the necessary information for the traceability of traders that offer distance sales of products or services on the platform; and (vi) duly identifying the advertising displayed on their interfaces, including information about the main parameters used to determine the recipient of the advertising.
Micro and small enterprises are excluded from these obligations because compliance is considered a disproportionate burden that could bar new competitors from the market and thus act as a disincentive to the creation of digital companies in the EU.
Additional obligations for very large online platforms to manage systemic risks
One question that may initially arise is what is meant by “very large online platforms”: essentially, those with at least 45 million monthly active recipients of the service in the Union. Once again, the aim is to apply the most burdensome obligations only to the platforms that are equipped to deal with them.
Here we are talking about obligations aimed at identifying possible risks and, as far as possible, offsetting those risks. The obligations that the Commission seeks to lay down include: (i) preparing a risk analysis to determine, at least, the risks arising from the dissemination of illegal content, any negative effect on the exercise of users’ fundamental rights, and possible manipulation of their services that may have a negative effect on the protection of public health, minors, civic discourse, electoral processes and public security; (ii) including mitigation measures tailored to the risks identified, such as adapting their content moderation and/or recommender systems, initiating or adjusting cooperation with trusted flaggers or limiting the display of advertisements; (iii) performing audits at least once a year to assess compliance; (iv) describing the parameters used in their recommender systems; (v) creating a repository of the information published; and (vi) appointing a compliance officer responsible for monitoring compliance.
Supervision systems and possible penalties
As we have seen, digital service providers and especially large online platforms that offer services in the Union have many obligations. In addition, the Commission proposes that a Digital Services Coordinator be designated in each Member State to determine the degree of compliance with the Regulation, investigate providers’ activities and, where necessary, impose penalties. Therein lies one of the most controversial points of the proposal: penalties can amount to up to 6% of the service providers' annual turnover.
Following the classification prepared by the Commission itself, we summarize below the new cumulative obligations that the Digital Services Regulation seeks to introduce for each type of provider:
What comes next?
Following the ordinary legislative procedure, the Commission’s proposal must now be sent to the European Parliament and to the Member States for debate and, where applicable, approval. A highly relevant aspect, which was not the case with the e-Commerce Directive, is that this time European lawmakers have chosen to set forth these rules in a regulation. This means that, once approved, it will be directly applicable throughout the EU.