Europe’s New Law Will Force Secretive TikTok to Open Up

The Digital Services Act will extract more new information from the young app than from older platforms like Facebook.

Social networks grow up faster these days. It took Facebook eight years to reach 1 billion users, but TikTok got there in just five. The fast-growing short-video app also came under political and regulatory pressure at a younger age, over concerns about its Chinese ownership and its influence on teen mental health.

The pressure on TikTok is now set to jump higher still. The European Union’s recently agreed-upon Digital Services Act (DSA) places new restrictions on the largest platforms, a reaction to the way established platforms like Facebook and YouTube have been used to undermine elections, promote genocide, and spread dangerous conspiracy theories. But the new rules are likely to bring about bigger changes on TikTok than on more established platforms.

To date, TikTok has been less transparent and less thoroughly studied than Facebook, Instagram, and YouTube. That’s partly because it is a much younger service, and fewer researchers and journalists have scrutinized its workings. But TikTok has also not provided tools to enable researchers to study how content circulates on its platform, as Facebook and Twitter have done. When Europe’s new rules force all large social platforms to open up their data and even algorithms to outside scrutiny, our understanding of TikTok may change most of all.

The DSA is aimed at reducing online harms, such as harassment, and making major online platforms more accountable for their effects on elections and other aspects of society, with large social networks and search engines the primary targets. The law was agreed upon late last month, weeks after the passage of a companion law aimed at tech monopoly power. “With today’s agreement we ensure that platforms are held accountable for the risks their services can pose to society and citizens,” European Commission executive vice president Margrethe Vestager said of the DSA decision. The law’s legal text is now being finalized, and it could take effect as soon as January 2024. As with Europe’s GDPR data protection law, the DSA may alter how tech companies around the world operate.

Earlier drafts of the DSA and details confirmed after negotiations concluded clearly suggest that the law will force major changes in the way social networks operate. The toughest measures are reserved for platforms with more than 45 million active users in the EU. TikTok said in 2020 that it had more than 100 million users in Europe.

TikTok declined to answer questions about what changes it might have to make to comply with the DSA. Spokesperson Brooke Oberwetter said that TikTok welcomed the DSA’s “focus on transparency as a means to show accountability” and that the company was “intent” on furthering its work “to build trust through transparency” with its users.

Experts say such transparency has been lacking. Russia’s invasion of Ukraine has highlighted TikTok’s power and its inscrutability. Soon after the war began in February, TikTok became central to the spread of rumors and videos from the war. But researchers at nonprofits and in academia cannot easily monitor how such content is spreading because the company doesn’t offer APIs that enable study of its platform, as Facebook and Twitter do. Social media research collective Tracking Exposed had to use software that surfs and scrapes TikTok to uncover how the company quietly constrained the content available to users in Russia.

The requirements the DSA lays on large platforms could provide a much more complete picture in future emergencies that play out online. A “crisis mechanism” in the law allows the European Commission to order the largest platforms to restrict certain content in response to a security or health emergency. The DSA also requires big platforms to provide “vetted” external researchers with access to the data needed to study online risks at all times. “Data access is a game-changer,” says Alex Engler, a fellow at the Brookings Institution think tank who studies the social impact of algorithms. “It will allow systematic evaluation of the actual outcomes and effects of these platforms and can change the societal insight we can have into these public squares.”

Mandatory data access could be particularly revealing for TikTok, given the relative dearth of studies on the platform and the fact that it does not currently provide data access for researchers. The law may also help researchers who have squabbled with Meta over limits the company places on what data it will provide from Facebook, and its lack of an API for Instagram.

Further insight could flow from the DSA’s requirement that large platforms provide regulators access to their recommendation algorithms—in TikTok’s case potentially opening up the code behind its notoriously addictive For You page. The company has already established programs in the US and Europe that brief invited experts on its recommendation and moderation systems, but these don’t offer the ongoing and granular transparency access code and data may provide. Large platforms will also be required to provide users with an alternative to personalized recommendations or social feeds. It is unclear whether such alternatives could include the part of TikTok’s app that shows videos from accounts a user follows, or options like Facebook and Twitter provide to rank posts chronologically instead of based on a person’s activity.

The DSA also requires large platforms to dedicate their own resources to monitoring their services. For Facebook or TikTok that might mean working to ensure their moderation and ranking systems don’t penalize or favor certain groups or content. Platforms will be subject to external audits each year to check that they have properly monitored for risks and taken appropriate action.

That could force TikTok to hire more internal researchers to probe what happens on its platform. Meta and Twitter already have significant research teams that sometimes publish academic papers on their findings. Those studies are not wholly independent of corporate influence, but they can still be instructive, says Casey Fiesler, a professor who researches tech ethics and internet law at the University of Colorado, Boulder. A recent Twitter study found that across six countries, including the US, its feed-ranking algorithm amplified posts from elected officials on the political right more than posts from those on the left. “I’m not seeing TikTok publish papers with that kind of research,” says Fiesler, who also maintains a popular TikTok account.

Fiesler says the DSA’s requirement for self-reflection could help offset the incentives companies have to avoid looking for problems on their own platforms. Many of the revelations about Facebook and Instagram made public by ex-Facebook employee Frances Haugen last year came from internal research into problems like moderation errors in Arabic and teens’ use of Instagram. A company seeking to avoid damaging leaks might choose to stop doing internal studies altogether, but the DSA aims to shut that sort of thinking down.

The law also takes aim at platforms’ processes for moderating content. It gives users the right to challenge content moderation decisions and to seek redress. Drafts of the law included a provision that platforms of any size be required to state the reasons for removing content. Fiesler says that could force changes to TikTok’s moderation system, which in her experience often fails to clearly explain why a video has been removed, or why the company has decided to reverse or maintain that decision after a user clicks a button to appeal a removal. “The moderation system on TikTok is one of the most frustrating I have ever dealt with,” she says.

New transparency rules for moderation could be easier to handle for older companies like Facebook and Twitter, which have over time developed more elaborate systems to inform users which rule or rules have been broken when content is removed. Facebook’s appeals process includes an oversight board of outside experts who rule on some contested decisions. Despite its grandiose title, the board has been criticized for being ineffective and too small to make a difference, but no other large social network has established a similar mechanism.

The DSA will also ban the use of data from minors to serve ads. The exact age cutoff and implementation details are still unclear, but TikTok might be hit harder than other large platforms because its user base skews younger than that of rivals such as Instagram. When investment bank Piper Sandler surveyed 7,100 US teens with an average age of 16 this year, a third said TikTok was their favorite social platform, ahead of Snapchat and Instagram.

TikTok currently allows ads to be targeted to users as young as 13. In 2020, The New York Times reported that internal documents from TikTok showed the company estimated more than a third of its US users were 14 or younger. Musical.ly, an app folded into TikTok in 2018, paid $5.7 million in 2019 to settle allegations by the US Federal Trade Commission that the company had breached a US law that requires parental consent to collect personal data from children under 13.

The DSA’s requirements on the largest platforms will be enforced by staff at the European Commission, the EU’s executive branch. That’s a different approach from the EU’s GDPR data protection law, which is enforced by regulators in individual member states. Under that decentralized model, the data regulator of Ireland, home to many tech companies’ European offices, has been swamped—much to the annoyance of more populous countries, like Germany.

Even though enforcement of the DSA applies only to Europe, it is likely to change how tech platforms behave outside the EU, including in the US. Anu Bradford, a professor at Columbia Law School, says many companies may choose to offer the same protections to users outside the EU, as Microsoft and some others did for certain rights introduced by GDPR.

The reasons are both technical and political. Bradford says globe-spanning platforms may struggle to create new, DSA-compliant ways of operating just for the EU, while users and politicians outside of Europe may balk at the idea that greater protections are offered to users elsewhere. “Americans are going to ask ‘Why are you protecting Europeans and their minors but choosing not to do the same for us in the US?’” she says.

