The UK government has unveiled four new laws targeting child sexual abuse images generated by artificial intelligence (AI).
Under the new laws, it will be illegal to possess, create, or distribute AI tools designed to generate child sexual abuse material (CSAM), punishable by up to five years in prison. Possessing AI pedophile manuals and related teaching materials will also be banned, carrying a maximum sentence of three years in prison.
Home Secretary Yvette Cooper said AI is escalating online child abuse on an unprecedented scale and stressed that further government action is needed to address the trend.
Additional legislation will criminalize running websites that facilitate the sharing of child sexual abuse content or offer advice on grooming children, punishable by up to 10 years in prison. The Border Force will also gain the power to require suspected individuals to unlock their digital devices for inspection on entry to the UK, with up to three years in prison for the most serious cases of CSAM possession.
AI-generated CSAM involves manipulated imagery, including realistic fabrications that use real children's faces and voices. Such images are used to deceive and blackmail victims, in some cases leading to further abuse.
Cooper condemned predators' use of AI to exploit and harm children, stressing that responses must evolve as quickly as the technology in order to safeguard children effectively.
Prof. Clare McGlynn welcomed the government's actions but called for broader restrictions, including a ban on "nudify" apps and measures against the proliferation of videos in which adult actors play childlike roles, which she terms "simulated child sexual abuse videos".
The sharp rise in reports of AI-generated CSAM underscores the urgency of tackling this disturbing trend. Experts point to the growing difficulty of distinguishing real imagery from AI-generated fakes, stressing the need for robust preventive measures and the shared responsibility of tech companies to create safer online environments for children.
Barnardo's chief executive, Lynn Perry, expressed support for the government's move to combat AI-produced CSAM and emphasized that legislation must keep pace with technological advances to protect vulnerable children.
The proposed legislative changes are set to be introduced through the Crime and Policing Bill in the upcoming parliamentary session.