When AI 'Nudify' Apps Target Children: What You Need to Know

New AI tools are making it disturbingly easy to turn ordinary photos into fake nude images. These so-called "nudify" apps are especially dangerous when used to target children - and the risks are escalating.
Here’s what parents, educators, and advocates need to know.
What are "nudify" apps?
Nudify apps use artificial intelligence to remove clothing from a photo - or to swap someone’s face onto a naked or sexualised image. These are sometimes referred to as nudification or deepfake tools.
They’re often easy to access, even for young people, and are being used to:
- Bully or humiliate classmates
- Blackmail or threaten children (sometimes known as sextortion)
- Create fake sexual images of minors from innocent photos
- Normalise sexualisation and abuse
Even when the image is fake, the harm is real.
Why is this so serious for children?
Children and young people are especially vulnerable to the effects of these tools:
- Emotional and psychological harm: Victims may feel ashamed, scared, or traumatised, leading to long-term mental health issues.
- Bullying and blackmail: AI-generated nudes are being used to threaten children, sometimes to coerce them into sending real intimate images.
- Peer misuse: Some children may use these apps on each other as a ‘joke’ - not realising it could be illegal or deeply damaging.
- Normalising abuse: When fake images of minors are passed around, it sends the message that their bodies can be manipulated and shared without consent.
This issue is not only about children’s safety - it is also about dignity, trust, and the emotional wellbeing of a whole generation.

Are these images illegal?
Laws vary widely across the world, and in many places they haven't caught up with the technology. But some progress is being made. For example, at the time of writing:
In the UK:
- It is already illegal to create, share or possess indecent images of children, even if they are AI-generated or digitally altered from a photo.
- New legislation (including amendments to the Crime and Policing Bill) will criminalise the development, possession, or use of AI tools and models specifically designed to generate such material, and will outlaw guides on how to create AI-generated child sexual abuse material.
- The National Crime Agency and Internet Watch Foundation (IWF) have issued guidance confirming that AI-generated child sexual abuse material must be treated with the same urgency and safeguarding response as any other child sexual abuse imagery.
- Under the Online Safety Act, platforms are now under a legal duty to reduce the risk of this kind of harm and take action when it occurs.
In the US:
- The TAKE IT DOWN Act (Tools to Address Known Exploitation by Immobilizing Technological Deepfakes On Websites and Networks Act) became federal law in May 2025.
- It makes it a federal crime to share non-consensual intimate images - including AI-generated ones - of both adults and children.
- The law also requires "covered platforms" to set up a reporting system and remove such images within 48 hours of being notified.
Around the world:
Countries such as Australia, South Korea, and Germany are also updating their laws to address this issue. But enforcement remains challenging, especially when tools are hosted across borders.
What can we do to protect children?
- 💡 Raise awareness
  - Teach children about AI tools, consent, and image privacy.
  - Help them understand the risks - and how to seek help.
  - Keep conversations open so they feel safe coming to you.
- 🛠️ Tackle the tools and platforms
  - Improve platform accountability: companies must invest in next-generation detection tools, as traditional systems often fail with AI-generated content.
- 📚 Equip adults
  - Give parents, teachers, and youth workers clear guidance.
  - Make sure safeguarding policies explicitly include synthetic sexual imagery.
  - Offer training on recognising and responding to AI-related abuse.
- 🛡️ Provide support
  - Provide counselling, legal support, and fast reporting channels.
  - Never blame or shame a child who has been targeted.
  - Treat AI-generated abuse as seriously as any other form of child sexual abuse.
- 🧭 Push for stronger laws
  - Ensure legislation bans the tools as well as the content.
  - Strengthen international cooperation - especially among G7 and Five Eyes nations - to tackle cross-border crime.
If your child has been affected
If you discover your child has been targeted by one of these apps:
- Stay calm - your response will shape how safe they feel speaking to you.
- Listen without judgment - this is not their fault.
- Report it - contact the police and report the content to the platform where it appeared.
- Seek guidance on evidence - contact law enforcement before screenshotting or downloading the image, and ask what they need from you. This is critical because in many jurisdictions possessing the image (even for evidence) may be a criminal offence, while in others it may be required. However, you should always record the URL, date, and time.
- Use the Take It Down Tool - this free service from NCMEC uses digital fingerprints to help minors remove non-consensual intimate images from participating sites.
Final Word
These apps are dangerous tools that can cause serious and lasting harm. Children deserve protection from AI systems that can be weaponised against them.
At the Safe AI for Children Alliance (SAIFCA), we're calling for stronger laws, safer platforms, and better education to reduce these risks and protect young people’s wellbeing.
For immediate help and advice
- UK
  - Internet Watch Foundation (IWF): Report illegal online content
    - Website: www.iwf.org.uk
  - CEOP (Child Exploitation and Online Protection): Make a direct report to police
  - Childline: Free, confidential support for children
    - Phone: 0800 1111
    - Website: https://www.childline.org.uk/
  - NSPCC Helpline: Advice for adults concerned about a child
    - Phone: 0808 800 5000
    - Website: https://www.nspcc.org.uk/
- US / International
  - NCMEC CyberTipline: Report suspected child exploitation
    - Website: https://report.cybertip.org/
  - Take It Down Tool (NCMEC): Service to help minors remove intimate images from participating platforms
    - Website: https://takeitdown.ncmec.org/