A ban on social media use for under-16s has been branded “speculation” – but the government must “continue to look at” the need to protect children, a minister has said.
Science minister Andrew Griffith dismissed as “speculation” reports that social media access could be curbed for some young people as part of a “potential consultation” into the issue.
Ministers have reportedly been discussing the impact sites such as TikTok and Instagram have on young people’s wellbeing, and whether, under future plans, under-16s could be required to seek their parents’ permission before using the social networks.
Asked by Sky News whether such proposals could ever be enforceable, Mr Griffith said: “Well, we’re talking about speculation.”
He said the government had already passed the Online Safety Act that “makes sure that those activities which are illegal offline are now also illegal online”.
However, he went on to say there were “genuine harms” on social media alongside the “good things”.
“I understand as a parent myself that parents feel very strongly about the need to protect our children from some of the ills in society that have been running rampant in the past on social media,” he said.
“We’ve already taken action, and it’s right that we continue to look at that. I don’t think you’re ever going to say that job is done.
“So the speculation is about a potential consultation in the new year.”
Pushed on whether a consultation was happening, he said: “I don’t think any of us know what’s happening and I wouldn’t comment on future consultations at this point in time.”
The Online Safety Act became law in October and aims to make the UK “the safest place in the world to be online”.
Under the legislation, rules have been imposed on firms such as Meta and Apple to ensure they keep inappropriate and potentially dangerous content away from young and vulnerable people.
One example is material promoting suicide or self-harm, which a coroner ruled last year had contributed to teenager Molly Russell taking her own life.
The Act also aims to hold platforms responsible for illegal content such as child sexual abuse images, make adult websites properly enforce age limits, and stop under-age children from creating social media accounts.
Media regulator Ofcom is responsible for enforcing the new rules, with companies liable for fines of up to £18m or 10% of their annual global turnover for non-compliance – whichever is greater.
Firms and senior managers could also be held criminally liable if found not to be doing enough to protect children, and in the most extreme cases platforms could be blocked from operating in the UK entirely.