The Data Brief

A monthly data protection bulletin from the barristers at 5 Essex Chambers

Teen Accounts: A New Dawn for Social Media?

2 October 2024

In September 2024, Meta (owner of Facebook, WhatsApp and Instagram) announced that it would be releasing ‘Teen Accounts’. Given that Meta has almost 4 billion users a month, this signals an important shift in the approach taken by social media giants towards children’s privacy.

What are Teen Accounts?

Teen Accounts will automatically be given to all new users under the age of 16. Such accounts have a variety of protective measures enabled by default: they are set to private, there are restrictions on receiving messages from third parties, and time limits are applied to app usage. These controls can only be overridden with the consent of a parent.

At present, Teen Accounts apply only to new teen users. However, Meta appears to be planning to roll them out to existing teen users over the coming months.

The ICO’s View

The ICO released a statement supporting the move. It said:

“We welcome Instagram’s new protections for its younger users following our engagement with them. Our Children’s code is clear that kids’ accounts must be set as ‘high privacy’ by default, unless there is a compelling reason not to do so. We’ll keep pushing where we think industry can go further, and take action where companies are not doing the right thing”.

What is the Children’s Code?

The ICO’s Children’s Code (also known as the ‘Age Appropriate Design Code’) is a code of practice. It contains 15 standards that for-profit online services should follow.

The Code applies to ‘Information Society Services likely to be accessed by children’. An ‘Information Society Service’ (“ISS”) is “any service normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient of services”. This captures social media companies, including those not based in the UK.

Under the Code, ISSs are expected to adhere to a variety of standards. These include not using nudge techniques to encourage children to give up more data, providing parental controls, and implementing high privacy by default.

Comment

Meta and other social media companies have come under increasing pressure to address the negative consequences of their products. The threat of regulation is growing.

Nick Clegg, Meta’s President of Global Affairs, has previously admitted that Meta’s child safety controls were being underused by parents – perhaps because many parents do not know how to use them. The new ‘privacy by default’ design seeks to overcome these practical problems and will help reduce Meta’s legal risk.

Ian Russell, father of Molly Russell (who was found by a coroner’s court to have “died from an act of self-harm whilst suffering from depression and the negative effects of on-line content”), described the changes as a ‘turning point’. However, Mr Russell also warned that previous changes had been ineffective.

This view appears to be shared by the ICO, which welcomed the changes but stated “we think industry can go further”. The threat of regulation is not going away.

On a practical level, there is perhaps a risk that teenagers simply raise the ages listed on their existing accounts before Teen Accounts are rolled out to them. Meta has said it is taking steps to counteract such attempts. Whether those efforts are effective will be critical to the success of this new scheme.


The Data Brief is edited by Francesca Whitelaw KC, Aaron Moss and John Goss, barristers at 5 Essex Chambers, with contributions from the whole information law, data protection and AI Team.
