Children as young as 12 have had their faces superimposed onto the bodies of naked girls or been ‘undressed’ by people using deepfake-generating apps.
The apps use AI-powered image generation to analyse a person’s photo and then create a realistic, fake naked image by digitally reconstructing what the body might look like under the clothes.
Deborah Vassallo, coordinator of the Foundation for Social Welfare Services’ Be Smart Online internet safety project, said the organisation had received the first report of deepfake naked images six months ago. There have been three other cases since.
“In one case, a photo of a girl addressing her schoolmates at an assembly was used to create a naked image of her,” Vassallo said.
The youngest victim was 12 years old and another was under 16. A third girl was aged between 16 and 18 and the fourth was 20. All cases except the last were reported by the parents of the children involved; the 20-year-old came forward herself. The school-age victims attended different State and Church schools and came from various localities.
“The images were either shared on messaging apps, such as WhatsApp, to humiliate the victim or sent to the victim herself by a fake profile in an attempt to blackmail her,” Vassallo said.
She added that such photos were often created and spread as an act of revenge following the end of a relationship or a rejection.
Parents, she noted, faced a dilemma when considering whether to report the images to the authorities.
“If parents have evidence, we advise them to go to the police, but they are often reluctant to make a report because of the systemic delays in the justice system and because they would not want to put their children through the trauma of being asked uncomfortable questions by the police.
“Sometimes, they would not even want to involve the school for the same reason. In a sense, the children are victims twice over – victims of the crime and victims of the system,” she said.
There is also a danger of the images making their way to internet forums frequented by paedophiles. Vassallo said she believed that people caught sharing AI-generated naked images should be charged under the same laws that criminalise the possession and distribution of child sexual abuse material.
When it comes to discussing deepfake naked images and other types of online abuse with children, Be Smart Online advocates awareness and education.
“We cannot stop children using technology. If we try and restrict access too much, they will never learn how to use it responsibly and be careful what they share,” Vassallo said, adding that concepts like consent and respect were also taught.
A similar case in Spain last year underscored the growing risks posed by AI image manipulation. In the town of Almendralejo, more than 20 teenage girls discovered that their photos had been altered using an app that generates fake naked images.
The pictures were generated and circulated by classmates, some of whom were as young as 14. Several victims only found out after peers began making comments or sharing the images on messaging apps.
The case sparked national debate, with parents, educators and politicians calling for urgent legal reform. Although the Spanish police opened an investigation, the fact that many of those involved were under the age of criminal responsibility highlighted the gap between existing legislation and the rapid rise of such technology.