The DSA, towards a new accountability framework for platforms?

What is the Digital Services Act (DSA)?

The Digital Services Act (or DSA) is the EU legislation regulating digital services. It aims to create a safe digital space for European consumers. Proposed by the European Commission in 2020, the DSA was the subject of a political agreement between the Parliament and the Council in April 2022 and was adopted by the European Parliament on 5 July 2022. Following the Council’s approval on 4 October 2022, the DSA was finally adopted after two years of negotiation.

More specifically, the DSA requires intermediary services, such as cloud services, social networks or e-commerce platforms, to fight against illegal and harmful content, the trade of illegal goods and services, disinformation and other societal risks, and the misuse of online services.

Who is concerned?

The DSA targets intermediary services, in particular hosting services, i.e. services that store information provided by, and at the request of, a recipient of the service.

Among hosting services, the DSA distinguishes between:

  1. Online platforms, i.e. hosting services which, at the request of the user, store information and disseminate it to the public.

  2. Very large online platforms, i.e. online platforms whose number of average monthly active recipients of the service in the EU is equal to or higher than 45 million (roughly 10% of the EU population).

What will the DSA change in the European Union?

(1) Content moderation

The DSA intends to prevent the dissemination of illegal content such as hate speech or disinformation. Currently, hosting services are not liable for the content they host as long as they have no knowledge of the illegal content and, upon gaining such knowledge, act expeditiously to remove or disable access to it.

With the DSA, hosting services will have to moderate their content and allow users to notify any illegal or inappropriate content they encounter. Once notified, platforms must react quickly: for instance, they are required to act without delay when the notified content involves a threat to life or to people’s safety. Entities specialised in the identification of illegal content, called trusted flaggers, will be given a special, priority notification channel. Once a hosting service has been notified, it is considered to have knowledge of the illegal content and may therefore be held liable for continuing to host it.

The content moderation practices of very large online platforms are frequently criticised by non-governmental organisations.

For instance, in March 2021, Reporters Without Borders filed a complaint against Facebook, arguing that the safe environment promised in Facebook’s terms of service was not delivered because of its deficient content moderation. In particular, Reporters Without Borders highlighted that degrading comments about journalists, including death threats, remained visible and even appeared among the most relevant comments.

(2) Redress mechanisms

The DSA also protects users’ freedom of expression by prohibiting hosting services from deleting or unpublishing their content without a valid reason.

Therefore, hosting services will have to provide the author of the content with a statement of reasons explaining why the content is considered illegal or harmful and how the decision can be appealed. The author will then be able to challenge the decision through an out-of-court dispute settlement body or through the platform’s internal complaint-handling mechanism.

While content moderation is needed, some platforms have handled this issue without sufficient consideration for users’ freedom of expression.

In 2020, YouTube relied heavily on an AI content moderation system, which deleted 11 million videos without human supervision. The system turned out to be overzealous: about 160,000 videos were put back online after their authors complained.

(3) Fight against the trade of illegal goods and services

The DSA aims to fight the trade of illegal goods and services, as well as criminal offences. Very large online platforms will therefore be bound to notify law enforcement and judicial authorities of any suspicion of criminal offences.

Also, to prevent the untraceable trade of illegal goods and services, traders will only be allowed to promote or offer products and services to users on online platforms if they prove their identity and disclose certain banking and commercial information.

EU product safety laws are meant to protect the European market from harmful and dangerous products. However, such products still find their way into the European market through online platforms.

For instance, in February 2020, the European Consumer Organisation purchased 250 goods from online marketplaces including Amazon, AliExpress, eBay and Wish, targeting products based on possible risks. 66% of these products failed EU safety requirements, posing risks of electric shock, fire or suffocation.

(4) Risk assessment and mitigation

The DSA promotes transparency and responsibility.

Responsibility stems from the requirement for online platforms to adopt a risk-based approach in order to prevent the misuse of their services and protect their users.

The transparency of digital services is promoted through the obligation for hosting services to provide users with clear and transparent terms of use, and to publish transparency reports on their content moderation, dispute settlement and risk assessment policies. The DSA also prohibits online platforms from nudging people into using their services.

When Elon Musk expressed interest in buying Twitter, he said: “By ‘free speech’, I simply mean that which matches the law. I am against censorship that goes far beyond the law.”

If this vision were adopted by Twitter, it would conflict with the risk-based approach of the DSA, which requires very large online platforms to mitigate the risks their systems pose to fundamental rights, public interests, public health and security. Such an approach requires very large online platforms not only to moderate illegal content, but also to protect users against harmful content.

This is why EU and Member State officials strongly reminded Elon Musk that Twitter would have to comply with the DSA. Since then, Elon Musk has sought to reverse his decision to buy Twitter.

(5) Enhancing transparency

Finally, the DSA is meant to tackle the lack of transparency of profiling. According to the DSA, both advertising and recommender systems should be transparent to the users of the service. Concerning targeted advertising, the DSA will also restrict the use of sensitive data and ban advertising targeted at minors.

Likewise, dark patterns, i.e. online interface designs or practices aimed at deceiving or manipulating users or at distorting their choices or behaviour, will be prohibited on online platforms.

Transparency requirements also apply to content moderation practices: any information on the policies, procedures, measures and tools used for content moderation, including algorithmic decision-making and human review, as well as the rules of procedure of the internal complaint-handling system, must be included in the intermediary service’s terms and conditions.

The Cambridge Analytica scandal, in which personal data was fraudulently collected to influence the Brexit vote and the election of Donald Trump, showed how dangerous the lack of transparency of targeted advertising can be for democracy. It led platforms to take measures, such as banning political ads.

However, in March, the Mozilla Foundation found that the ban on political ads was insufficiently enforced on TikTok, as the platform does not seem to monitor influencers’ advertising and does not require the disclosure of paid relationships with political groups.

(6) Crisis mechanism

In the context of the war in Ukraine and its impact on the manipulation of online information, the DSA introduces a crisis response mechanism. “This mechanism will make it possible to analyse the impact of the activities of very large online platforms (VLOPs) and very large online search engines (VLOSEs) on the crisis in question and rapidly decide on proportionate and effective measures to ensure the respect of fundamental rights.”^[https://www.consilium.europa.eu/en/press/press-releases/2022/10/04/dsa-council-gives-final-approval-to-the-protection-of-users-rights-online/]

How will the DSA be enforced?

Member States are required to designate one or more competent authorities responsible for the enforcement of the DSA, and to designate one of them as their Digital Services Coordinator. The Digital Services Coordinators will be the national competent authorities in charge of enforcing the provisions of the DSA.

They will be given investigative powers and the power to impose fines, which cannot exceed 6% of the annual worldwide turnover of the intermediary service concerned.

Conclusion

According to the Council of the EU, “The DSA is considered a world first in the field of digital regulation: no other legislative act has this level of ambition as regards regulating platforms and online supervision while preserving the core principles of the internal market.”^[https://www.consilium.europa.eu/en/press/press-releases/2022/10/04/dsa-council-gives-final-approval-to-the-protection-of-users-rights-online/]
The DSA will start to apply in 2024 and is expected by some to become the global “gold standard” of online platform regulation.

The DSA will regulate digital spaces all the more effectively as it is part of a legislative package aimed at framing digital platforms, which also includes the DMA that we analysed here.

Eloïse Quinzin

December 13, 2022
