
Is TikTok safe for children under 13? What UK parents need to know

TikTok's terms of service set the minimum age at 13, but the ICO has estimated that up to 1.4 million UK children under 13 have used the platform anyway. Here's what the regulator found, and what parents should do.

8 min read
15 March 2025

TikTok is now the most-used social media platform among UK children aged 8–12, despite having a minimum age of 13. Understanding what that means for your child — and what you can actually do about it — requires cutting through a lot of noise.

What TikTok's terms actually say

TikTok's Terms of Service prohibit users under 13 from creating accounts. Users aged 13–15 are placed in a restricted mode that limits direct messaging, live streaming, and who can interact with their content. However, age verification at sign-up amounts to a self-declared date of birth, which a child can falsify in seconds.

The age verification problem

The Online Safety Act 2023 requires platforms likely to be accessed by children to use 'highly effective' age assurance. As of early 2025, Ofcom is still consulting on what that standard means in practice. TikTok's current self-declaration approach is widely acknowledged, including by TikTok itself, to be insufficient.

Ofcom's 2023 survey found that 44% of 8–11-year-olds have a social media profile despite platforms requiring users to be 13. On TikTok specifically, one in three children aged 8–11 report using it.

What the ICO has ruled about children's data

In April 2023, the ICO fined TikTok £12.7 million for unlawfully processing the data of up to 1.4 million UK children under 13 without parental consent. The regulator found that TikTok knew, or ought to have known, that under-13s were using the platform, and failed to take adequate steps to identify and remove them.

The content risk: what kids are actually seeing

TikTok's recommendation algorithm is exceptionally good at serving content that keeps users engaged. For vulnerable young users, that can mean escalating exposure to content about body image, eating disorders, self-harm, and extreme viewpoints. A 2022 investigation by the Centre for Countering Digital Hate found that a new account showing interest in weight-related content could be served eating disorder content within eight minutes.

  • Algorithmic serving of harmful content is faster and more targeted than most parents realise
  • TikTok's restricted mode for under-15s does not prevent all harmful content
  • Duet and Stitch features can expose children to adult creators without warning
  • Comments sections on children's videos can attract adult attention

How to have the conversation with your child

Banning TikTok outright rarely works — children simply use it on a friend's device. The more effective approach is an honest, curiosity-led conversation about what they watch, who they follow, and how it makes them feel. Our Child Guide includes word-for-word scripts for this exact conversation.

Safer alternatives if they're not ready

For children under 13, YouTube Kids (with parental controls enabled), Scratch, and BBC iPlayer offer creative, age-appropriate alternatives. The key is not to present these as punishments but as stepping stones — 'let's start here and review it when you're 13'.

Go deeper

Growing Up Digital: Child Guide

Our full guide covers everything in this article — and much more.

