A new taskforce will take “digital fingerprints” of child sexual abuse images in a bid to stop them being shared online.
Analysts will assess, hash and grade two million images from the UK government’s Child Abuse Image Database (CAID), creating a unique code, like a digital fingerprint, for each one.
The illegal images comprise category A and B material – the most severe images and videos of child sexual abuse.
The taskforce was set up by UK charity the Internet Watch Foundation (IWF) and is funded through a grant from international child protection organisation Thorn.
The IWF will distribute the hashes to tech companies across the globe so the images can be blocked or removed if users try to share them.
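In broad terms, hash matching works by turning each known image into a short, fixed string of characters and checking new uploads against a list of those strings. The Python sketch below is only an illustration of that general idea, not the IWF’s actual system: the KNOWN_HASHES set and function names are hypothetical, and production tools such as Microsoft’s PhotoDNA use perceptual hashes that survive resizing and re-encoding rather than the plain cryptographic hash shown here.

```python
# Minimal sketch of hash matching, under the assumptions stated above.
import hashlib

# Hypothetical hash list as it might be distributed to a platform:
# hex digests of known illegal images (placeholder value only).
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(data: bytes) -> str:
    """Return a 'digital fingerprint' of a file: its SHA-256 hex digest."""
    return hashlib.sha256(data).hexdigest()

def should_block(upload: bytes) -> bool:
    """Block the upload if its fingerprint matches a hash on the list."""
    return fingerprint(upload) in KNOWN_HASHES
```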
The charity said that 2020 was the worst year on record for the amount of child sexual abuse material identified and removed online.
Its analysts processed 299,600 reports of potentially illegal material, a rise of 15% on the previous year.
They found that more than half (153,350) contained images or videos of children being sexually abused – up 16% from the previous year.
Susie Hargreaves, IWF chief executive, described the new taskforce as a “major step forward for internet safety”.
She said: “We’ve created this world-leading taskforce of highly trained analysts to help boost the global efforts to stop the distribution of child sexual abuse imagery online.
“Not only will this absolutely vital work help to create a safer internet for us all, but it will help those victims whose sexual abuse imagery is shared time and time again, preventing their continued revictimisation and exploitation.”
Safeguarding minister Victoria Atkins added: “This government is determined to ensure that we are doing everything in our power to prevent child sexual abuse online and the innovative use of technology is central to this.
“I am pleased that Child Abuse Image Database (CAID) data is helping the IWF to carry out this valuable work towards reducing access to child sexual abuse material online and thereby preventing the revictimisation of children.”
Julie Cordua, chief executive of Thorn, said: “IWF’s work to eliminate child sexual abuse images from the internet and end the cycle of revictimisation is critical and tremendously difficult.
“We are grateful for their continued commitment to this work and are humbled to support their efforts.”