Lying about how old you are is a childhood rite of passage, but in the social media era it comes around earlier than ever.
The minimum age for most social media platforms is 13. But for younger children, registering an account on a smartphone is as simple as adjusting the year of birth and pressing “okay”.
With no attempt at age verification, that’s a far less nerve-wracking deception than fibbing to an usher to get into a 15-rated movie.
And judging by the Information Commissioner’s Office (ICO) finding against TikTok, it comes with almost zero chance of detection.
Abuse of terms and conditions
The ICO found that around 1.4 million under-13s in the UK are routinely using the platform, and that TikTok was insufficiently concerned by this industrial-scale abuse of its terms and conditions.
The company “failed to carry out adequate checks to identify and remove underage children from its platform”, the ICO found, and as a consequence failed to get parental consent to use their data – a legal condition for using the personal information of under-13s.
That in turn raised the possibility that under-13s had been tracked and profiled, and potentially delivered “harmful, inappropriate content”.
These findings may have come as more of a surprise to parents than to their children, among whom TikTok remains a sensation but has long ceased to be just a cheerful forum for cute dance moves.
Harmful content
The potentially harmful content to which the ICO refers will have been served up by the TikTok algorithm, meaning anyone aged 13 and over may see it too – but without any risk of sanction for the company.
There are concerns because the TikTok algorithm is particularly effective at delivering more of what users’ behaviour suggests they want, whether it’s good for them or not.
The National Society for the Prevention of Cruelty to Children (NSPCC) said: “Because TikTok uses algorithms to show users new content, it’s easy for young people to come across inappropriate or upsetting videos.”
TikTok contests the findings and says it “invests heavily” to police its age restrictions, but the ICO judgment addresses one of the central concerns about social media: that a combination of its inherent form and specific content is harmful to mental health.
The same concerns have been raised about other platforms, which have faced similar questions over the use and retention of user data and the monitoring of content.
How TikTok’s ownership plays a role
What makes TikTok different is its ownership. The first non-American social media behemoth happens to be controlled by a Chinese company, ByteDance, and that’s put it in the crosshairs of Western governments as well as regulators.
Hours before the ICO published its findings, Australia became the latest country to ban the TikTok app from government devices, joining the United States, Canada, the European Union and the UK.
These governments contend that allowing TikTok to “scrape” data from government devices – a process for which users have to give permission – poses a security risk because that data could end up in the hands of the Chinese authorities.
In the US it has become a corporate frontline for rising tension with Beijing.
A congressional committee last month queued up to hammer its chief executive, Shou Zi Chew, who denied being subject to state influence and said the data of its estimated 150 million American users would be moved to US servers held by a US company.
That is unlikely to end the concern about national security or personal safety, but calls for an outright ban are not straightforward.
Millions of users, young and not so young, use and enjoy TikTok by choice every day. Banning a platform will not come without protest, even if others would surely fill the scrolling space.
And those users include at least one cabinet member, Grant Shapps, the Ministry of Defence, and Number 10 Downing Street, all of which have active TikTok accounts – suggesting they value access to its audience even on a platform whose safety they doubt.