The Data Brief

A monthly data protection bulletin from the barristers at 5 Essex Chambers

The Clock is TikToking for Misuse of Children’s Data

26 April 2023

Children’s privacy consistently features, alongside cyber security, as a top data protection concern in the UK.[1]

This month (April 2023), the ICO issued a £12.7 million fine to TikTok Information Technologies UK Limited and TikTok Inc (together, TikTok) – the Chinese-owned social media video-sharing app – for breaching the UK GDPR.

The ICO determined that the following breaches took place between May 2018 and July 2020:

  • Processing the personal data of UK children under the age of 13 without consent or authorisation from their parents or carers (the ICO estimated that more than a million children under 13 were using the site without consent);
  • Failing to provide proper information to users of the platform regarding how data is collected, used and shared in a way that is easy to understand so that users, in particular children, were able to make informed choices about whether and how to engage with TikTok;
  • Failing to ensure that the personal data of UK users was processed lawfully, fairly and transparently.   

A further provisional finding that TikTok had unlawfully processed special category data was not pursued – saving TikTok from a significantly larger fine of £27 million.

The ICO has also published the Children’s Code (or Age Appropriate Design Code) – a statutory[2] data protection code of practice for online services, such as apps, online games and social media sites, that are likely to be accessed by children. It translates the UK GDPR’s requirements into design standards, setting out 15 standards of age-appropriate design.

The express purpose of the Code is not to protect children from the digital world but to protect them from within it. Failure by organisations to comply with the Code can result in enforcement action including compulsory audits, orders to stop processing and fines of up to 4% of global turnover.

In September 2022, the ICO clarified that adult-only services fall within the scope of the Code if they are likely to be accessed by children – businesses are therefore well advised to have regard to the Code even if children are not their target audience. The ICO has developed draft guidance to assist in assessing whether children are likely to access particular services; its case studies include online dating services, pornography sites, gaming, and social media platforms. Age assurance is a particularly topical issue that vexes online businesses and has troubled the Online Safety Bill. Affected companies, stakeholders and others with an interest currently have an opportunity to respond to the ICO’s consultation on its draft ‘likely to be accessed’ guidance in the context of the Children’s Code. The deadline for responses is 19 May 2023.[3]

[1] See, for example, research conducted by the ICO, Ofcom and the London School of Economics 

[2] Section 123, Data Protection Act 2018

[3] ICO consultation on the draft guidance for ‘likely to be accessed’ in the context of the Children’s Code

Further reading: the ICO’s Children’s Code Hub


The Data Brief is edited by Francesca Whitelaw KC, Aaron Moss and John Goss, barristers at 5 Essex Chambers, with contributions from the whole information law, data protection and AI Team.
