TikTok is once again under regulatory scrutiny as the European Commission opens a formal probe to determine whether the ByteDance-owned video platform protects minors as required under the Digital Services Act (DSA). The inquiry, announced in Brussels, will examine whether TikTok complies with the new rules governing large online platforms that have become central to daily life for millions of children and teenagers across Europe.
The formal procedure, initiated by the European authorities, focuses on potential violations of the DSA, the contemporary framework that began applying to major tech companies this past weekend. The rules push for more transparency, demand risk mitigation for products and services, ensure that researchers have broader access to data sets for academic purposes, and strengthen user privacy protections, particularly for younger users.
Thierry Breton, the European Commissioner for the Internal Market, underscored the importance of safeguarding young users. He noted that TikTok’s reach among youths comes with a responsibility to fully meet the DSA requirements and to take an especially attentive role in online child protection.
The investigation will look into several areas, including how TikTok’s recommendation algorithms may influence users’ physical and mental wellbeing, the platform’s age-verification measures, privacy settings, and the nature of advertisements directed at younger audiences. In its briefing, the Commission highlighted concerns about possible shortcomings in granting researchers access to publicly available data from TikTok.
“The protection of minors is a top priority of the DSA,” the Commission stated, signaling that the probe will dive deep into practices that could affect user safety and data access for oversight.
The process is described as a thorough, prioritized inquiry that could extend over several months. During this period, the Commission retains the option to take additional enforcement steps, including provisional measures and formal findings of non-compliance if warranted.
The scrutiny surrounding TikTok is not occurring in isolation. Earlier, the Commission opened another case against X, the social network formerly known as Twitter, for issues related to the publication of disinformation and violent content connected to ongoing global conflicts. The momentum suggests a broader push to ensure that major platforms adhere to the DSA’s obligations as gatekeepers of user data and online discourse.
Industry observers note that penalties under the DSA can be substantial. If violations are confirmed, the platform could face fines of up to six percent of its global annual turnover, a cap that could translate into significant financial repercussions. The European authorities emphasize that the outcome of these proceedings will shape how large tech firms manage user safety, data access, and advertising practices across the bloc.
The ongoing inquiry reflects the European Union’s broader strategy to modernize digital governance. By prioritizing child protection and transparency, regulators aim to balance innovation with accountability while maintaining a consistent standard across the digital marketplace. Officials reiterate that enforcement tools may evolve as the assessment progresses, ensuring that the DSA remains an effective mechanism for safeguarding users in an ever-changing digital landscape.