TikTok received a record US fine earlier this year for illegally collecting personal data of children under 13. Photograph: Danish Siddiqui/Reuters

TikTok under investigation over child data use

UK inquiry looking at whether video-sharing app breaches data protection law

The video-sharing app TikTok is under investigation in the UK for how it handles the personal data of its young users, and whether it prioritises the safety of children on its social network.

Elizabeth Denham, the information commissioner, told a parliamentary committee the investigation began in February, prompted by a multimillion-dollar fine levied on the app by the US Federal Trade Commission (FTC) for similar violations.

“We are looking at the transparency tools for children,” Denham said on Tuesday. “We’re looking at the messaging system, which is completely open, we’re looking at the kind of videos that are collected and shared by children online. We do have an active investigation into TikTok right now, so watch this space.”

As well as general concerns about how private data was collected, the commissioner said there were concerns about how the open messaging system allowed any adult to message any child.

She said the company was potentially violating the general data protection regulation (GDPR) which “requires the company to provide different services and different protections for children”.

Quick Guide

Children and tech

Laws governing children's relationship with technology vary worldwide, and are rarely enforced. The de facto age for many online services is 13, set by the US Children’s Online Privacy Protection Act in 1998, which prevents sites from targeting children, or knowingly allowing children to provide information online without parental consent. The burden of garnering that consent and the low returns for building services for children have meant, however, that providers tended to turn a blind eye to under-13s on their sites, neither catering for them nor policing their presence.

That said, tech aimed more explicitly at children has blossomed recently, and legislation that aims to protect children from potential harm has been passed. Schoolchildren in France are barred by law from using their phones in school.

Such laws are countered by efforts on the part of companies such as Facebook and Google to attract new users while young. Facebook offers Messenger Kids, which lets children speak to contacts vetted by their parents, while Google’s YouTube has a Kids app that offers copious parental controls and the ability to filter videos for all but the most child-safe content – although the filters, which are run by an algorithm, haven’t always been successful, prompting the company to announce a human-curated version.

Proposed guidelines to improve child internet safety in the UK, set out by the Information Commissioner’s Office in its 'Age appropriate design code', include:

  • Disabling 'nudge' techniques designed to keep children online for longer, such as 'streaks' on Snapchat or Facebook 'likes'.
  • Limiting how children’s personal data is collected, used and shared by social media companies.
  • Making “high privacy” the default setting for children using social media platforms, including disabling geolocation tools and targeted advertising as standard, unless there is a compelling reason not to.
  • Requiring social media companies to show that all staff involved in the design and development of services likely to be used by children comply with the code of practice.
  • Introducing robust age verification checks on platforms, or treating all users as if they are children.

In February, Bytedance, the Chinese firm that owns TikTok, was fined a record $5.7m (£4.2m) for illegally collecting personal information from children under 13. The FTC said the company had previously been aware that “a significant percentage of users were younger than 13”, the age at which US laws mandate strict data protections, “and received thousands of complaints from parents that their children under 13 had created Musical.ly accounts”. Musical.ly was the previous name of TikTok.

Despite that, the FTC’s chair, Joe Simons, said, “they still failed to seek parental consent before collecting names, email addresses and other personal information from users under the age of 13”.

Bytedance, a private startup based in Beijing, has a valuation of $75bn, based primarily on the extraordinary growth of TikTok and its Chinese equivalent, Douyin. The app is popular among teenagers and preteens for its combination of music and meme-based humour.

In April, Lil Nas X found fame overnight when his track Old Town Road was used extensively in 15-second clips on the social network. That enthusiasm took the artist to the top of the US Billboard Hot 100 chart.

A company can be fined up to €20m (£17.9m), or 4% of its global annual turnover, whichever is higher, for violating the GDPR. As a private company, Bytedance does not have to disclose its revenue, so it is unknown how high such a fine could be.

In a statement, TikTok said: “We cooperate with organizations such as the ICO to provide relevant information about our product to support their work. Ensuring data protection principles are upheld is a top priority for TikTok.”
