Children’s privacy consistently features, alongside cyber security, as a top data protection concern in the UK.[1]
This month (April 2023), the ICO issued a £12.7 million fine to TikTok Information Technologies UK Limited and TikTok Inc (TikTok) – the Chinese-owned social media video-sharing app – for breaching the UK GDPR.
The ICO determined that the following breaches took place between May 2018 and July 2020:
- providing its services to UK children under the age of 13 and processing their personal data without consent or authorisation from their parents or carers;
- failing to provide proper information to people using the platform about how their data is collected, used and shared in a way that is easy to understand; and
- failing to ensure that the personal data of UK users was processed lawfully, fairly and in a transparent manner.
A further provisional finding that TikTok had unlawfully processed special category data was not pursued – saving TikTok from a significantly larger fine of £27 million.
Following its investigation into TikTok, the ICO published the Children’s Code (or Age Appropriate Design Code) – a statutory[2] data protection code of practice for online services likely to be accessed by children, such as apps, online games and social media sites. It translates the UK GDPR requirements into design standards, setting out 15 key principles.
The express purpose of the Code is not to protect children from the digital world but to protect them within it. Failure by organisations to comply with the Code can result in enforcement action, including compulsory audits, orders to stop processing and fines of up to 4% of global turnover.
In September 2022, the ICO clarified that adult-only services are in scope of the Code if they are likely to be accessed by children – businesses are therefore well advised to have regard to the Code even if children are not their target audience. The ICO has developed draft guidance to assist in assessing whether children are likely to access particular services; its case studies include online dating services, pornography sites, gaming, and social media platforms. Age assurance is a particularly topical issue that vexes online businesses and has troubled the Online Safety Bill. Affected companies and other interested stakeholders currently have an opportunity to respond to the ICO’s consultation on its draft guidance on ‘likely to be accessed’ in the context of the Children’s Code. The deadline for responses is 19 May 2023.[3]
[1] See, for example, research conducted by the ICO, Ofcom and the London School of Economics: https://ico.org.uk/for-organisations/childrens-code-hub/
[2] Section 123, Data Protection Act 2018
[3] ICO consultation on the draft guidance for ‘Likely to be accessed’ in the context of the Children’s Code
Further reading: the ICO’s Children’s Code Hub
A monthly data protection bulletin from the barristers at 5 Essex Chambers
The Data Brief is edited by Francesca Whitelaw KC, Aaron Moss and John Goss, barristers at 5 Essex Chambers, with contributions from the whole information law, data protection and AI Team.