The Digital Services Act (DSA) is an EU regulation designed to create a safer and more transparent online environment. The DSA applies to all online platforms operating in the EU, with an external audit obligation for those with more than 45 million monthly active users. In 2025, the DSA further reshaped the operational reality for online platforms across the European Union (EU). The second audit cycle (2024/2025) expanded to 23 Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs), revealing challenges related to transparency and user protection. Enforcement also intensified, with the European Commission (EC) issuing its first major fine (€120 million imposed on X) and securing binding commitments from TikTok.
The formal integration of the Code of Conduct on Hate Speech and the Code of Conduct on Disinformation, alongside the publication of guidelines on the protection of minors, further broadened the regulatory and auditable scope. Looking ahead to 2026, further enforcement actions are expected, together with expanded external audits and supervision under new delegated acts, Codes of Conduct and guidelines.
This article is a follow-up to our previous DSA article [Beem24].
Introduction
Given its purpose to protect users, combat illegal content and goods, increase transparency and give citizens greater control over their online experience, the DSA’s role became more pronounced in 2025. Having moved beyond its initial implementation phase, the DSA is now actively shaping the operational reality for online platforms across the EU. This shift became evident through the imposition of the first major fine, additional requests for information from the EC, binding commitments, the integration of specific Codes of Conduct as part of the DSA, the publication of the new guidelines on the protection of minors, as well as the adoption of the delegated act on access to data for vetted researchers. At national level, local supervisory authorities have begun setting up and opening initial investigations. This article reflects on the key developments related to the DSA in 2025 and considers prospective trends and implications for the future.
Second DSA audit cycle
In late November and early December 2025, the audit reports from the second cycle of external, independent DSA audits of designated VLOPs and VLOSEs (see Figure 1) were published, with the exception of those of Temu and XNXX.com.
Figure 1. Designated VLOPs and VLOSEs.
In 2024, the first year of DSA audits, 19 VLOPs and VLOSEs were subject to external DSA audits. All audit reports included negative conclusions, with the exception of Wikipedia. In 2025, during the second audit cycle, the scope expanded to 23 VLOPs and VLOSEs, four of which underwent their first audit and published audit reports for the first time. In this cycle, 19 platforms received negative opinions, while only one (Stripchat) received a fully positive opinion and one (Snapchat) received a positive opinion with comments.
Compared to the first audit year, the second audit year shows a slight increase in the share of “positive” opinions at obligation level (from 72% to 79%) and a slight reduction in “positive with comments” opinions (from 17% to 15%), while the share of “negative” opinions almost halved (from 11% to 6%) ([KPMG26]). See Figure 2 for a diagram of the division of audit opinions per platform and per obligation.
Figure 2. DSA audit opinion per platform and per obligation.
The DSA moves into enforcement
In 2024/2025, the majority of the VLOPs received a Request for Information (RFI) from the EC, some of which resulted in ongoing investigations and binding commitments. Key enforcement themes include recommender systems, moderation of illegal content, systemic risk assessments, transparency reporting and online protection of minors.
In 2025, the DSA reached a major milestone in its enforcement trajectory, marking a significant step from initial investigations towards more assertive regulatory action. In December 2025, the EC issued its first fine under the DSA, penalizing X (formerly Twitter) €120 million for shortcomings in advertising transparency and for misleading users through its verification badge system. This case signaled that the EC is ready to move from oversight to decisive enforcement, with potential penalties reaching up to 6% of a company’s global annual turnover.
Enforcement under the DSA, however, does not necessarily take the form of financial sanctions. This became evident in the case of TikTok concerning advertising transparency. Rather than facing a fine, TikTok worked closely with the EC and made binding commitments to address the EC’s concerns. These commitments include clearer and more accessible disclosures of sponsored content, enhanced labeling of paid promotions, improved visibility of advertising indicators, and the creation of a publicly accessible advertising repository with enriched metadata.
Integration of the Codes of Conduct into the DSA
In 2025, the DSA framework took another significant step forward with the formal integration of two Codes of Conduct. Originally conceived as voluntary instruments, these codes now fall under the DSA’s scope as part of Article 45, transforming soft commitments into structured compliance expectations.
The first code is the Code of Conduct on Countering Illegal Hate Speech Online + (Code of Conduct on Hate Speech), which was integrated into the DSA framework in 2025 (the original version of this Code was adopted in 2016). It establishes requirements for the timely removal of harmful content and the reporting of such content. The second code, which is the Code of Conduct on Disinformation, had functioned as a voluntary standard since 2018 and became an official DSA Code of Conduct in mid-2025. This development signals the EC’s intent to strengthen measures against election interference and algorithmic amplification of disinformation.
While participation in these codes remains voluntary, compliance by signatories will be monitored and assessed as part of annual audits. The Code of Conduct on Hate Speech was already partially included in the 2025 audit. Of the seven VLOPs that subscribed to the Code of Conduct on Hate Speech, five (Facebook, Instagram, LinkedIn, Snapchat and YouTube) included these results in their 2025 audit reports. TikTok had not yet included these commitments, while X did not report on them at commitment level.
The Code of Conduct on Hate Speech will continue to be included in the audit. The Code of Conduct on Disinformation will be introduced in the audit scope for the third audit cycle, covering the period of 2025–2026.
Introduction of the guidelines on the protection of minors
After a prolonged period of anticipation regarding guidance on Article 28 of the DSA on the protection of minors online, the EC published the final guidelines on the protection of minors on 14 July 2025. The guidelines build on Article 28(1) of the DSA, which states that “online platforms accessible to minors shall put in place appropriate and proportionate measures to ensure a high level of privacy, safety, and security of minors, on their service”. While this provision establishes a clear objective, it leaves considerable room for interpretation as to what constitutes appropriate and proportionate measures. The guidelines help translate the high-level obligations of Article 28 into practical and actionable measures for online platforms. They are relevant for all online platforms accessible to minors, except those qualifying as micro or small enterprises (fewer than 50 employees and an annual turnover not exceeding €10 million).
Beyond these granular expectations, the guidelines introduce two broader thematic pillars. First, they introduce the requirement of conducting a risk review. The EC expects online platforms to conduct assessments to identify and mitigate risks that their services might present to the safety and protection of minors and implement measures based on the outcomes. This should determine an overall risk level for minors on the platform (low, medium, or high).
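The guidelines do not prescribe how the overall risk level should be derived from individual assessments. As a purely hypothetical illustration, a platform could aggregate per-area ratings conservatively by carrying forward the most severe one; the risk areas, names and aggregation rule below are assumptions for illustration, not part of the guidelines:

```python
# Hypothetical sketch: aggregate per-area risk ratings for minors into one
# overall level (low, medium or high), as the guidelines require platforms
# to determine. The "take the most severe rating" rule is an assumption –
# the guidelines themselves prescribe no formula.
SEVERITY = {"low": 0, "medium": 1, "high": 2}

def overall_minor_risk(area_ratings: dict) -> str:
    """Return the most severe rating across all assessed risk areas."""
    return max(area_ratings.values(), key=SEVERITY.get)

# Example with invented risk areas:
ratings = {
    "recommender_system": "medium",
    "contact_with_strangers": "high",
    "default_settings": "low",
}
print(overall_minor_risk(ratings))  # high
```

Whatever method a platform chooses, the point of the exercise is that the resulting level drives the mitigation measures it must then implement and be able to explain.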
Another topic introduced in the guidelines is age assurance. Self-declaration alone is no longer considered sufficient. Instead, platforms are expected to deploy reliable and non-intrusive tools. While the guidelines describe several methods – age verification, age estimation and self-declaration – the emphasis is placed on determining the appropriateness and proportionality of the specific measures from each method in ensuring a high level of privacy, safety and security of minors. Particular attention is given to the suitability of age verification and age estimation, as well as the potential use of the EU Digital Identity Wallet or EU age verification solution (anonymized cryptographic tokens or zero-knowledge proofs).
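The privacy-preserving intent behind such tokens can be sketched in a few lines. The example below is a simplified, hypothetical illustration only: an accredited issuer attests to a single boolean claim (“over 18”) and the platform verifies that attestation without ever receiving a date of birth. Real solutions such as the EU age verification solution rely on asymmetric cryptography or zero-knowledge proofs; the shared-secret HMAC here merely illustrates the data-minimising idea.

```python
# Hypothetical sketch of token-based age assurance: the issuer signs only a
# boolean claim, so the verifying platform learns nothing beyond "over 18".
import hmac, hashlib, json

ISSUER_KEY = b"demo-shared-secret"  # placeholder; a real issuer holds a private key

def issue_age_token(over_18: bool) -> dict:
    """Issuer side: attest to a boolean claim only – no birth date included."""
    claim = json.dumps({"over_18": over_18}).encode()
    tag = hmac.new(ISSUER_KEY, claim, hashlib.sha256).hexdigest()
    return {"claim": claim.decode(), "tag": tag}

def verify_age_token(token: dict) -> bool:
    """Platform side: check authenticity, learn nothing beyond the claim."""
    expected = hmac.new(ISSUER_KEY, token["claim"].encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, token["tag"]) and json.loads(token["claim"])["over_18"]

token = issue_age_token(over_18=True)
print(verify_age_token(token))  # True for a valid over-18 attestation
```

The design choice the guidelines point towards is exactly this separation of roles: the party that knows the user’s age is not the platform, and the platform that gates access never handles identity data.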
The guidelines also cover other topics relevant to the protection of minors on online platforms, such as online interface design, content moderation, default settings and account settings, and recommender systems. These include detailed expectations, for example with respect to AI features, which should not be activated by default and should not encourage or entice minors to use them. The guidelines also set granular requirements for default settings, such as disabling geolocation tracking, microphone access, photo access and camera features by default.
Although non-binding, the guidelines are expected to shape the external, independent audits in 2025/2026 and influence DSA enforcement, underscoring that protecting minors has shifted from a peripheral concern to a core principle of responsible platform design – not only for VLOPs and VLOSEs, but for all online platforms that do not qualify as micro or small enterprises.
New delegated act on data access for qualified researchers
In July 2025, the EC adopted a delegated act under Article 40 of the DSA (Data access and scrutiny), setting out the rules governing qualified researchers’ access to data. Qualified researchers are individuals affiliated with recognized research institutions who meet strict criteria to access non-public data from VLOPs and VLOSEs. Through the delegated act, the EC intends to provide researchers with unprecedented scrutiny powers, allowing them to contribute effectively to the monitoring of the online environment.
The delegated act lays down the technical conditions and harmonizes the procedures for the management of the data access process. It sets out which information Digital Services Coordinators (DSCs), VLOPs and VLOSEs must make public to facilitate vetted researchers’ applications to access relevant datasets. In addition, the EC launched the DSA data access portal, where researchers interested in accessing data under the new mechanism can find information and exchange with VLOPs, VLOSEs and DSCs on their data access applications.
The delegated act strengthens Article 40 by defining rules for researcher data access and introducing detailed technical and transparency requirements. Binding data access obligations for designated VLOPs and VLOSEs will directly impact external audits in 2025/2026 and the enforcement of the DSA under Article 40.
What’s next: emerging trends & challenges
The DSA framework shows an ongoing trend of expansion that is set to further refine the compliance landscape in 2026 and beyond.
Enforcement actions
In November 2025, the EC reaffirmed that the 45 million monthly active user threshold remains fit for purpose. For platforms hovering near the threshold, this confirmation underscores the need for proactive compliance planning. In 2026, as user bases grow and services expand across the EU, it is expected that new platforms may be designated as VLOPs or VLOSEs under the DSA.
Following the first final non-compliance decision in the case of X, the EC is expected to proceed with the other open investigations, many of which have been pending since the first half of 2024, with the aim of closing them in the near future. The outcome of these cases is still to be determined: they may end in binding commitments agreed between the VLOPs/VLOSEs and the EC, or in fines for non-compliance. In addition, the guidelines on the protection of minors may provide much-needed clarity in the cases related to Article 28, which could facilitate a quicker completion of those investigations.
DSA and the state of play of the EU digital regulatory landscape
In November 2025, the EC introduced the Digital Omnibus Regulation Proposal, a set of technical amendments to a large corpus of digital legislation. The aim of the Omnibus is to reduce compliance costs and enhance legal clarity while preserving the overarching objectives of existing laws. The DSA, however, is not part of the proposed package. Considering the developments analyzed above, it is noteworthy that the DSA exemplifies a regulatory framework that has undergone substantial operational and interpretative expansion since its adoption in 2022 without requiring formal legislative amendment.
Audit period 2025–2026
In terms of the upcoming audit period of 2025–2026, the audit scope of the services will significantly broaden. This is mainly due to the introduction of the new delegated act on data access, the Codes of Conduct and the guidelines on the protection of minors.
The Codes apply only to their signatories; therefore, only the VLOP and VLOSE signatories are audited against the Codes. In 2026, audit reports are expected from all seven VLOP signatories of the Code of Conduct on Hate Speech. With regard to the Code of Conduct on Disinformation, all VLOP and VLOSE signatories are expected to undergo their first audit year under the Code. The Code broadens the overall audit scope, as it places greater focus on disinformation in advertising placements, political advertising, the integrity of services and the empowerment of users, researchers and fact-checkers. It will further qualify the audit approach taken with respect to key DSA obligations (e.g. Articles 25, 26, 34, 35 and 40), given the interrelated and informative nature of some of the topics, and corresponding findings, across the Code and the DSA.
Moreover, the guidelines on the protection of minors regarding Article 28 could be used as benchmarks (comply or explain) by the VLOPs and VLOSEs and challenged during the external, independent audits. Given the non-binding nature of the guidelines, it remains to be seen to what extent they will be considered and applied during the third audit cycle.
Lastly, the delegated act complements Article 40 of the DSA by outlining rules granting access to data for qualified researchers. This act will inform and further clarify the audit approach on Article 40, given the granularity of technical conditions and transparency requirements set out in the delegated act that current VLOPs and VLOSEs are required to meet.
Conclusion
The DSA stands out as one of the first EU regulations to impose an external audit obligation on VLOPs and VLOSEs, marking a significant shift toward accountability in the digital ecosystem. The progression from the first audit cycle in 2024 to the second in 2025 demonstrates measurable improvements in compliance, yet persistent challenges remain in areas such as transparency reporting, systemic risk mitigation and protection of minors. Enforcement actions – such as the first fine – signal the European Commission’s readiness to move from oversight to assertive intervention. With the integration of Codes of Conduct, new guidelines on the protection of minors, and the delegated act on data access, the regulatory scope continues to expand, shaping audits and enforcement for 2026 and beyond.
References
[Beem24] Beemdelust, A. van, Klein Tank, K. & Rietschoten, M. van (2024, December). From regulation to reality: the DSA’s early impact on trust and online safety. Compact 2024/2. Retrieved from: https://www.compact.nl/articles/from-regulation-to-reality-the-dsas-early-impact-on-trust-and-online-safety/
[KPMG26] KPMG (2026, January). Analysis of the 2024/2025 Digital Services Act audit reports. Retrieved from: https://kpmg.com/nl/en/home/insights/2025/03/kpmgs-2024-digital-services-act-dsa-audit-reports-benchmark.html

