The Commission adopted the first designation decisions under the Digital Services Act (DSA), designating 17 Very Large Online Platforms (VLOPs) and 2 Very Large Online Search Engines (VLOSEs) that reach at least 45 million monthly active users. These are:
Very Large Online Platforms:
- Alibaba AliExpress
- Amazon Store
- Apple AppStore
- Booking.com
- Facebook
- Google Play
- Google Maps
- Google Shopping
- Instagram
- LinkedIn
- Pinterest
- Snapchat
- TikTok
- Twitter
- Wikipedia
- YouTube
- Zalando
Very Large Online Search Engines:
- Bing
- Google Search
The platforms and search engines were designated on the basis of the user numbers that they were required to publish by 17 February 2023.
Next steps for designated platforms and search engines
Following their designation, the companies will now have to comply, within four months, with the full set of new obligations under the DSA. These aim to empower and protect users online, including minors, by requiring the designated services to assess and mitigate their systemic risks and to provide robust content moderation tools. This includes:
- More user empowerment:
- Users will get clear information on why they are recommended certain information and will have the right to opt out of recommendation systems based on profiling;
- Users will be able to report illegal content easily and platforms have to process such reports diligently;
- Advertisements cannot be displayed based on the sensitive data of the user (such as ethnic origin, political opinions or sexual orientation);
- Platforms need to label all ads and inform users about who is promoting them;
- Platforms need to provide an easily understandable, plain-language summary of their terms and conditions, in the languages of the Member States where they operate.
- Strong protection of minors:
- Platforms will have to redesign their systems to ensure a high level of privacy, security, and safety of minors;
- Targeted advertising based on profiling towards children is no longer permitted;
- Special risk assessments, including for negative effects on mental health, will have to be provided to the Commission four months after designation and made public at the latest a year later;
- Platforms will have to redesign their services, including their interfaces, recommender systems, terms and conditions, to mitigate these risks.
- More diligent content moderation, less disinformation:
- Platforms and search engines need to take measures to address risks linked to the dissemination of illegal content online and to negative effects on freedom of expression and information;
- Platforms need to have clear terms and conditions and enforce them diligently and non-arbitrarily;
- Platforms need to have a mechanism for users to flag illegal content and act upon notifications expeditiously;
- Platforms need to analyse their specific risks, and put in place mitigation measures – for instance, to address the spread of disinformation and inauthentic use of their service.
- More transparency and accountability:
- Platforms need to ensure that their risk assessments and their compliance with all the DSA obligations are externally and independently audited;
- They will have to give researchers access to publicly available data; later on, a special mechanism for vetted researchers will be established;
- They will need to publish repositories of all the ads served on their interface;
- Platforms need to publish transparency reports on content moderation decisions and risk management.
Within four months of notification of the designation decisions, the designated platforms and search engines need to adapt their systems, resources, and processes for compliance, set up an independent compliance function, and carry out and report to the Commission their first annual risk assessment.
Risk assessment
Platforms will have to identify, analyse and mitigate a wide array of systemic risks ranging from how illegal content and disinformation can be amplified on their services, to the impact on the freedom of expression and media freedom. Similarly, specific risks around gender-based violence online and the protection of minors online and their mental health must be assessed and mitigated. The risk mitigation plans of designated platforms and search engines will be subject to an independent audit and oversight by the Commission.
A new supervisory architecture
The DSA will be enforced through a pan-European supervisory architecture. While the Commission is the competent authority for supervising the designated platforms and search engines, it will work in close cooperation with the Digital Services Coordinators in the supervisory framework established by the DSA. These national authorities, which are also responsible for supervising smaller platforms and search engines, need to be established by EU Member States by 17 February 2024. That same date is also the deadline by which all other platforms must comply with their obligations under the DSA and provide their users with the protection and safeguards laid down in the DSA.
To enforce the DSA, the Commission is also bolstering its expertise with in-house and external multidisciplinary knowledge and recently launched the European Centre for Algorithmic Transparency (ECAT). ECAT will support assessments of whether the functioning of algorithmic systems is in line with the risk management obligations. The Commission is also setting up a digital enforcement ecosystem, bringing together expertise from all relevant sectors.
Access to data for researchers
The Commission also launched a call for evidence on the provisions in the DSA related to data access for researchers. These are designed to better monitor platform providers' actions to tackle illegal content, such as illegal hate speech, as well as other societal risks such as the spread of disinformation and risks that may affect users' mental health. Vetted researchers will be able to access the data of any VLOP or VLOSE to conduct research on systemic risks in the EU. This means that they could, for example, analyse platforms' decisions on what users see and engage with online, with access to previously undisclosed data. Drawing on the feedback received, the Commission will present a delegated act to design a simple, practical, and clear process for data access with adequate safeguards against abuse. The consultation will last until 25 May.
Background
On 15 December 2020, the Commission presented its proposal for the DSA, together with the proposal for the Digital Markets Act (DMA), as a comprehensive framework to ensure a safer and fairer digital space for all. Following the political agreement reached by the EU co-legislators in April 2022, the DSA entered into force on 16 November 2022.
The DSA applies to all digital services that connect consumers to goods, services, or content. It creates comprehensive new obligations for online platforms to reduce harms and counter risks online, introduces strong protections for users’ rights online, and places digital platforms under a unique new transparency and accountability framework. Designed as a single, uniform set of rules for the EU, these rules will give users new protections and businesses legal certainty across the whole single market. The DSA is a first-of-a-kind regulatory toolbox globally and sets an international benchmark for a regulatory approach to online intermediaries.
Source: European Commission | News (https://bit.ly/3Nfr6T8)