EU opens formal probe of TikTok under Digital Services Act, citing child safety, risk management and other concerns

The European Union is formally investigating TikTok’s compliance with the bloc’s Digital Services Act (DSA), the Commission has announced.

The areas the Commission is focusing on in this investigation of TikTok are linked to the protection of minors, advertising transparency, data access for researchers, and the risk management of addictive design and harmful content, it said in a press release.

The DSA is the bloc’s online governance and content moderation rulebook which, since Saturday, has applied broadly to — likely — thousands of platforms and services. But since last summer, larger platforms, such as TikTok, have faced a set of extra requirements in areas like algorithmic transparency and systemic risk, and it’s those rules that the video-sharing platform is now being investigated under.

Today’s move follows several months of information gathering by the Commission, which enforces the DSA rules for larger platforms — including requests for information from TikTok in areas including child protection and disinformation risks.

The EU’s concerns over TikTok’s approach to content governance and safety predate the DSA coming into force on larger platforms, though. TikTok was forced to make some operational tweaks back in June 2022, after regional consumer protection authorities banded together to investigate child safety and privacy complaints.

The Commission will now step up its information requests to the video-sharing platform as it investigates the string of suspected breaches. This could also include conducting interviews and inspections, as well as asking it to send more data.

There’s no formal deadline for the EU to conclude this in-depth probe — its press release just notes that the duration depends on several factors, such as “the complexity of the case, the extent to which the company concerned cooperates with the Commission and the exercise of the rights of defence”.

TikTok was contacted for comment on the formal investigation. A company spokesperson emailed us this statement:

TikTok has pioneered features and settings to protect teens and keep under 13s off the platform, issues the whole industry is grappling with. We’ll continue to work with experts and industry to keep young people on TikTok safe, and look forward to now having the opportunity to explain this work in detail to the Commission.

In its press release, the Commission says the probe of TikTok’s compliance with DSA obligations in the area of systemic risks will look at “actual or foreseeable negative effects” stemming from the design of its system, including algorithms. The EU is worried TikTok’s UX may “stimulate behavioural addictions and/or create so-called ‘rabbit hole effects’”.

“Such assessment is required to counter potential risks for the exercise of the fundamental right to the person’s physical and mental well-being, the respect of the rights of the child as well as its impact on radicalisation processes,” it further writes.

The Commission is also concerned that mitigation measures TikTok has put in place to protect kids from accessing inappropriate content — namely age verification tools — “may not be reasonable, proportionate and effective”.

It will therefore also look at whether TikTok is complying with “DSA obligations to put in place appropriate and proportionate measures to ensure a high level of privacy, safety and security for minors, particularly with regard to default privacy settings for minors as part of the design and functioning of their recommender systems”.

Elsewhere, the bloc’s probe will look at whether TikTok is fulfilling the DSA requirement to provide “a searchable and reliable repository” for ads that run on its platform.

Also on transparency, the Commission says its investigation concerns “suspected shortcomings” by TikTok when it comes to providing researchers with access to publicly accessible data on its platform so they can study systemic risk in the EU — with such data access being mandated by Article 40 of the DSA.

Commenting in a statement, Margrethe Vestager, EVP for digital, said:

The safety and well-being of online users in Europe is crucial. TikTok needs to take a close look at the services they offer and carefully consider the risks that they pose to their users – young as well as old. The Commission will now carry out an in-depth investigation without prejudice to the outcome.

In another supporting statement, internal market commissioner Thierry Breton emphasized that: “The protection of minors is a top enforcement priority for the DSA.”

“As a platform that reaches millions of children and teenagers, TikTok must fully comply with the DSA and has a particular role to play in the protection of minors online,” he added. “We are launching this formal infringement proceeding today to ensure that proportionate action is taken to protect the physical and emotional well-being of young Europeans. We must spare no effort to protect our children.”

It’s the second such proceeding under the DSA, after the bloc opened a probe on Elon Musk-owned X (formerly Twitter) in December, also citing a string of concerns.

Penalties for confirmed breaches of the DSA can reach up to 6% of global annual turnover. Once an investigation has been opened, EU enforcers can also access a broader toolbox, such as the ability to take interim measures before a formal proceeding is wrapped up.

The EU may also accept commitments offered by a platform under investigation if they are aimed at fixing the issues identified.

Around two dozen platforms are subject to the DSA’s algorithmic transparency and systemic risk rules. These are defined as platforms with more than 45 million regional monthly active users. In TikTok’s case, the platform informed the bloc last year that it had 135.9 million monthly active users in the EU.

The Commission’s decision to open a child protection investigation on TikTok means Ireland’s media regulator won’t be able to step in and supervise the platform’s compliance in this area. (Under the decentralized, ‘country of origin’ enforcement structure the EU devised for the bulk of the regulation, the Irish regulator is responsible for overseeing TikTok’s compliance with the rest of the DSA’s rules.) It will be solely up to the Commission to assess whether or not TikTok has put in place “appropriate and proportionate measures to ensure a high level of privacy, safety, and security of minors”.
