EU enacts Digital Services Act strengthening online safety and governance

The DSA aims to ensure online safety for users and covers various platforms such as social media, e-commerce, search engines, hosting providers, and app stores.


The Digital Services Act (DSA) came into full effect on Saturday, 17 February 2024, marking a significant step by the EU towards a safer, fairer, and more transparent online environment. It applies to all online intermediaries serving users within the EU and covers a broad spectrum of platforms, including social media, e-commerce, search engines, hosting providers, and app stores. Initially applied to 19 designated very large platforms and search engines in August 2023, the law now extends to all other intermediaries, with lighter obligations for small and micro enterprises, those employing fewer than 50 persons with an annual turnover below €10 million, which are exempt from some of its requirements.

The DSA and the Digital Markets Act (DMA) represent the EU’s latest regulatory efforts to govern the online space. Under the DSA, EU users are afforded greater protection against illegal goods and content, and their rights are upheld on the online platforms where they engage with others, share information, or make purchases, as the European Commission notes.

The DSA imposes several obligations on online platforms operating within the EU. Platforms must implement mechanisms that allow users to flag illegal content, and they are prohibited from serving personalised ads based on sensitive data, targeting minors with personalised ads based on profiling or personal data, or using dark patterns that manipulate user choices. They must also enhance the traceability of sellers on online marketplaces and operate transparent content moderation procedures. In addition, platforms must provide clear terms and conditions, disclose the main parameters of their content recommendation algorithms, designate a point of contact for authorities and users, and give users detailed information about the advertisements they see, including why an ad is displayed and who the advertiser is.

Separate, stricter rules apply to the 17 Very Large Online Platforms (VLOPs) and 2 Very Large Online Search Engines (VLOSEs) designated last year. These platforms must take risk-based action, undergo independent audits of their risk management measures, and mitigate systemic risks such as disinformation, election manipulation, cyber violence against women, and harm to minors online. Compliance oversight for VLOPs and VLOSEs falls to the European Commission, while a Digital Services Coordinator in each EU Member State will oversee compliance for all other intermediaries. The European Board for Digital Services, comprising the Digital Services Coordinators and the Commission, will ensure coordination and consistent application of the law across jurisdictions.