Britain is set to become the first country to introduce laws against the use of AI tools to generate child sexual abuse images, after law enforcement agencies warned of an alarming proliferation in the use of such technology.
The issue has been a major concern for UK police, and the government has decided to close the legal loophole.
Those found guilty will face up to five years in prison.
What does the law include?
Under the new law, it will be illegal to possess, create or distribute AI tools designed to generate child sexual abuse material.
It will also be illegal to possess manuals that teach potential offenders how to use AI tools to make abusive imagery or to help them abuse children; those found guilty will face up to three years in prison.
The law will target people who run or moderate websites designed to share images or advice with other offenders.
The Border Force will also gain powers to compel anyone suspected of posing a sexual risk to children to unlock their digital devices for inspection.
The move follows repeated warnings over the use of AI tools to create child sexual abuse imagery, with reported cases more than quadrupling in the space of a year.
According to the Internet Watch Foundation (IWF), there were 245 confirmed reports of AI-generated child sexual abuse images last year, up from 51 in 2023.
In one 30-day period last year, the IWF found 3,512 AI-generated images on a single dark-web site.
Some of these images also use the voices of real children and victims.
AI tools are also helping perpetrators disguise their identities in order to groom and abuse their victims.
Senior police figures say there is now well-established evidence that people who view such images are likely to go on to abuse children in person.
They are also worried that the spread of AI-generated imagery could normalise the sexual abuse of children.
The bill, however, has not yet been tabled in Parliament.
(With inputs from agencies)