AI-Generated Indecent Images and UK Criminal Law

The use of AI tools to create indecent images of children raises difficult questions under UK criminal law. Most commercial AI platforms, such as ChatGPT or Gemini, advertise safeguards intended to prevent users from generating illegal material. However, recent controversy over Grok being used to “nudify” images illustrates how easily those protections can fail. Similar capabilities are also available through open “toolchains” – collections of software that run AI models without safeguards – allowing people to generate indecent or sexualised images that mainstream platforms would block.

This article explains how existing offences are applied to AI-generated and digitally altered images, and why fine margins in legal classification can make an outsized difference to sentencing and to whether an individual becomes subject to Sex Offender Register requirements.

How Existing UK Offences Apply to AI-Generated Images

Under current UK law, there is no offence which directly references AI-generated indecent images. Instead, prosecutors must fit AI-created or AI-altered images within one of two existing criminal offences. In practice, this means deciding whether an image falls within the category of indecent photographs (including pseudo-photographs) of a child, or instead within the separate category of ‘prohibited images’. That distinction has serious practical consequences, particularly for the available sentencing range and for the long-term impact on the accused.

Indecent photographs and pseudo-photographs (S.1 Protection of Children Act 1978)

Allegations involving indecent images of children are most commonly prosecuted under section 1 of the Protection of Children Act 1978. This makes it a criminal offence to make, possess, distribute, or show an indecent photograph or pseudo-photograph of a child.

Crucially, the statutory definition of a pseudo-photograph already captures computer-generated and digitally altered images, provided they appear to depict a real child. AI-generated or AI-altered images can therefore fall within this legislation, so long as they are sufficiently realistic. S.1 offences are often sent to the Crown Court for sentence and routinely cross the custody threshold. Well-prepared mitigation and planning can be the decisive factor in whether the judge is prepared to suspend a prison sentence.

Prohibited images (Coroners and Justice Act 2009)

There is a separate and narrower criminal offence under section 62 of the Coroners and Justice Act 2009 – being in possession of a ‘prohibited image of a child’.

This provision covers images that are not photo-realistic but which are nevertheless pornographic and grossly offensive, disgusting or otherwise of an obscene character. Typical examples include cartoons, drawings and manga-style images.

How are AI-generated images classified?

AI-generated imagery can fall into either category depending on how it appears and how it was created. Where an AI image convincingly resembles a photograph of a real child, prosecutors are likely to argue that it is a pseudo-photograph under section 1 of the 1978 Act. Where the image is clearly artificial or stylised, it is more likely to be approached as a prohibited image under the 2009 Act.

Crown Prosecution Service guidance reflects this same distinction. Prosecutors are directed to consider whether an image “appears to be” a real child and whether it was created or obtained for the purpose of sexual arousal. In practice, this inevitably involves an element of subjective judgment, which can be legally challenged. In AI-related cases, this often becomes a question of visual realism rather than the method of creation. As generative technology continues to improve, images that are entirely synthetic can nevertheless be indistinguishable from photographs, making the boundary between pseudo-photographs and prohibited images increasingly difficult to draw.

Why does classification matter?

Offences involving indecent photographs or pseudo-photographs are treated by the courts extremely seriously and frequently attract prison sentences. Prohibited-image offences, while still serious, occupy a lower tier of offending and are often sentenced on a very different basis.

For example, the Sentencing Council guideline for possession offences places a single Category A indecent image at a starting point of 12 months’ custody, even for a first-time offender. By contrast, possession of a single prohibited image does not automatically cross the custodial threshold and may, depending on the circumstances, be dealt with by way of a community order.

There is also an important impact on Sex Offender Register notification requirements. Unlike a section 1 offence, a conviction for possession of a prohibited image does not always trigger notification requirements; whether it does will depend on the sentence imposed.

Upcoming legislation targeting AI tools

Under proposals announced in early 2025, it will become a criminal offence to possess, create, or distribute AI tools that are designed for the purpose of generating child sexual abuse material. The proposed maximum sentence for this offence is five years’ imprisonment.

A related offence will criminalise possession of materials that provide instructions on how to use AI to create or share child sexual abuse material, sometimes referred to as “AI paedophile manuals”. This offence is expected to carry a maximum sentence of three years.

Defence considerations

AI-related indecent image cases raise a key issue – who is treated as having ‘made’ the image?

In traditional indecent image law, a person will normally be regarded as having made an image if they intentionally cause a computer or device to create it. The involvement of AI does not automatically change that principle.

However, questions can arise about whether the user of an AI tool genuinely caused the creation of an indecent image, particularly where prompts were vague or non-sexual and outputs were unexpected. The prosecution must still prove that the defendant’s actions were a real and operative cause of the image being produced, and that they knew the nature of what was being generated. Where an individual did not realise an indecent image had been created at all, this may undermine the prosecution case on knowledge and intention. Even if the image was generated in entirely blameless circumstances, it is important to note that an offence is still committed if the image is knowingly retained.

Many of the issues in this area are finely balanced. Given the serious consequences of facing prosecution, receiving expert advice is critical and can make a decisive difference to the outcome.

See our Indecent Images Solicitors service page for more information.

If you are under investigation or facing charges relating to AI-generated indecent images, contact our specialist indecent images solicitors for a confidential initial consultation via our online form, on 0333 240 7373, or by email at [email protected]


Nathan Seymour-Hyde

Partner & Solicitor

Meet the team