UK regulator hits TikTok with £12.7m fine over child data breaches

TikTok has been hit with a £12.7m fine by the UK regulator, the Information Commissioner’s Office (ICO), for a number of breaches of data protection law, including failing to use children’s personal data lawfully.

The ICO estimates that TikTok allowed up to 1.4m UK children under 13 to use its platform in 2020, despite its own rules not allowing children of that age to create an account.

UK data protection law says that organisations that use personal data when offering information society services to children under 13 must have consent from their parents or carers.

The ICO said that TikTok failed to do that, even though it ought to have been aware that under 13s were using its platform. It said that TikTok also failed to carry out adequate checks to identify and remove underage children from its platform.

The ICO investigation found that a concern was raised internally with some senior employees about children under 13 using the platform and not being removed. In the ICO’s view TikTok did not respond adequately.

The ICO found that TikTok breached the UK General Data Protection Regulation (UK GDPR) between May 2018 and July 2020 by providing its services to UK children under the age of 13 and processing their personal data without consent or authorisation from their parents or carers; by failing to provide proper information to people using the platform, in a way that is easy to understand, about how their data is collected, used and shared; and by failing to ensure that the personal data belonging to its UK users was processed lawfully, fairly and in a transparent manner.

The original ICO notice of intent for TikTok set the fine at £27m. Taking into consideration the representations from TikTok, the regulator decided not to pursue the provisional finding related to the unlawful use of special category data. That meant this potential infringement was not included in the final fine, set at £12.7m.

“There are laws in place to make sure our children are as safe in the digital world as they are in the physical world. TikTok did not abide by those laws,” said UK Information Commissioner, John Edwards. “As a consequence, an estimated one million under 13s were inappropriately granted access to the platform, with TikTok collecting and using their personal data. That means that their data may have been used to track them and profile them, potentially delivering harmful, inappropriate content at their very next scroll.

“TikTok should have known better. TikTok should have done better. Our £12.7m fine reflects the serious impact their failures may have had. They did not do enough to check who was using their platform or take sufficient action to remove the underage children that were using their platform.”

Since the conclusion of the ICO’s investigation of TikTok, the regulator has published the Children’s Code to help protect children in the digital world. It is a statutory code of practice aimed at online services, such as apps, gaming platforms, and web and social media sites, that are likely to be accessed by children. The code sets out 15 standards to ensure children have the best possible experience of online services.

We asked TikTok for their reaction to the fine. A spokesman told us:
“TikTok is a platform for users aged 13 and over. We invest heavily to help keep under 13s off the platform and our 40,000-strong safety team works around the clock to help keep the platform safe for our community. While we disagree with the ICO’s decision, which relates to May 2018 – July 2020, we are pleased that the fine announced today has been reduced to under half the amount proposed last year. We will continue to review the decision and are considering next steps.”

Ryan Gracey, partner and data privacy expert at law firm Gordons, said that while the fine may be well below what the ICO initially threatened, it is still a significant penalty and one of the largest ever issued by the ICO.

“It’s another acute reminder that technology companies must take steps to protect personal data, especially the data of children online,” he said. “In particular, businesses need to be aware of the ICO’s statutory code of practice known as the ‘Children’s Code’ which sets out a series of standards they expect businesses to follow when designing and building online services which may be used by children.

“The standards include using clear language in ‘bite-size’ chunks for children to tell them what they are doing with the user’s personal data, being open about the risks and safeguards involved, and letting the user know what to do if they are unhappy.”
