A report by the Australian eSafety commissioner has found children are bypassing inadequate and poorly enforced minimum age rules on social media platforms, many of which only ask users to self-declare their age at sign-up.
The Basic Online Safety Expectations (known as ‘the Expectations’), from the online safety regulator, aim to help keep young Australians safe while using social media, messaging and gaming services, as well as other apps and websites.
The report brings together findings from a national survey on social media use among Australian children aged 8-15 and data from social media platforms – YouTube, TikTok, Snapchat and Instagram – on how they enforce their own age restrictions.
According to the online safety regulator, responses from YouTube, Facebook, Instagram, TikTok, Snap, Reddit, Discord and Twitch, covering January to July 2024, showed that creating an account as someone aged under 13 was a simple process, with many services requiring only a self-declaration of age at sign-up.

eSafety commissioner Julie Inman Grant
‘Likely underestimation of the true numbers’
eSafety commissioner Julie Inman Grant said the report shows there is still significant work to be done by any social media platform relying on truthful self-declaration to determine age, with enforcement of the Government’s minimum age legislation on the horizon.
“Social media services not only need to make it harder for underage users to sign up to their services in the first place, but also make sure that users who are old enough to be on the service, but are not yet adults, have strong safety measures in place by default.
“Few have any real stringent measures in place to determine age accurately at the point of sign-up, so there’s nothing stopping a 14-year-old, for instance, entering a false age or date of birth and setting up an unrestricted adult account that doesn’t carry those extra safety features.
“And this likely means the platforms are unaware of the true numbers of users identified as children and teens on their services. Some platforms also make it very difficult to report under-aged users who are on their platforms today.”
She noted that the numbers of monthly active users under the age of 18 reported by services are likely to underestimate the true figures.
“Even with the likely underestimation of the true numbers, we are still talking about a lot of kids. For instance, Snapchat says of its 8.3 million monthly active users in Australia almost 440,000 are aged 13-15, Instagram with around 19 million users says around 350,000 are in this age group, YouTube with well over 25 million users said 325,000 were aged 13-15, while TikTok with close to 10 million users reported around 200,000 were in this early teen cohort.”

TikTok
In response, a TikTok spokesperson said: “As a platform, the safety of our more than 9.5 million Australian users is our highest priority and we are pleased that eSafety has recognised the best practice work we do to keep young people safe.
“Since the start of 2023, our industry-leading, proactive age detection tools have resulted in the removal of more than one million Australian users suspected of being under the age of 13.
“This report again shines a spotlight on the Government’s decision to give an exclusive carve out to the most popular social media platform for young Australians from the under 16 ban. Australian parents and guardians have a right to know what evidence, if any, supports the Government’s decision, so they can have confidence their children are safe on any exempted platforms,” they added.

Meta
Meta said in a statement: “Given children today have access to tablets and smartphones from a very young age, we recognise the importance of providing age-appropriate experiences. Understanding age is a complex challenge, however we continue to invest in AI and other technologies to address this issue.
“We believe the simplest and most effective way to understand a user’s real age is to require age verification at the OS/app store level once at the time of download, which can then send an age signal to social media apps. Research conducted last year showed most Australian parents supported this approach.”
Meanwhile, in regard to YouTube, the tech giant added: “The eSafety Commissioner’s own research indicates that the most popular social media service for U13s and U16s is YouTube. Given the government’s stated goal to ensure young Australians gain and grow from real experiences with real people, it would make sense that the recently legislated ban for under 16s to have a social media account also apply to YouTube when it is implemented.”
Inman Grant added that there will be continued consultations with industry and stakeholders throughout the year on what reasonable steps platforms should be expected to take to enforce minimum age requirements.