Middle schoolers are now using AI to create ‘deepfake’ pornography of their classmates

A recent news story out of Alabama should be getting far more attention than it is, because it is a glimpse into the future. Middle school students are using artificial intelligence (AI) to create pornographic images of their female classmates:

A group of mothers in Demopolis say their daughters’ pictures were used with artificial intelligence to create pornographic images of them. Tiffany Cannon, Elizabeth Smith, Holston Drinkard, and Heidi Nettles said they all learned on Dec. 4 that two of their daughters’ male classmates had created and shared explicit photos of the girls. Smith said it has been a rollercoaster of emotions since last Monday.

‘They’re scared, they’re angry, they’re embarrassed. They really feel like why did this happen to them,’ said Smith. The group of mothers said there is an active investigation with Demopolis Police, but they want the school district to take action as well. They believe this is an instance of cyberbullying and that there are state laws and policies in place to protect their girls.

‘We have laws in place through the Safe Schools law and the Student Bullying Prevention Act, which says that cyberbullying will not be tolerated either on or off campus,’ said Smith. ‘It takes a lot for these girls to come forward, and they did. They need to be supported for that. Not just from their parents, but from their school and their community,’ said Nettles.

The school hasn’t given many details yet. Demopolis City Schools Superintendent Tony Willis said in a statement that there is little the district can do: “The school can only address things that happen at school events, school campus on school time. Outside of this, it becomes a parent and police matter. We sympathize with parents and never want wrongful actions to go without consequences – our hearts and prayers go out to all the families hurt by this. That is why we have assisted the police in every step of this process.”

We’ll be seeing a lot more of this in the years ahead, as a generation weaned on hardcore pornography is increasingly enabled by technology to create this kind of imagery of people they know personally. The rise of sexting made pornography personal – educators and law enforcement are still grappling with how to curtail the nearly ubiquitous practice of sending and receiving intimate images, many of which are then shared with others. Many of these images, by virtue of the age of the students involved, constitute child pornography. AI-generated pornography adds a whole new list of disturbing issues on top of that.

A quick scan of recent headlines will give you a sense of where this is headed. From Fortune: “‘Nudify’ apps that use AI to undress women in photos are soaring in popularity, prompting worries about non-consensual porn.” These apps allow users to “digitally undress” people they know, creating nonconsensual pornography of girls and women. They have already acquired millions of users.

From MIT Technology Review: “A high school’s deepfake porn scandal is pushing U.S. lawmakers into action.” At a New Jersey high school, boys had used AI to “create sexually explicit and even pornographic photos of some of their classmates,” with as many as 30 girls affected. The sense of violation felt by the victims is profound.

From CNN: “Outcry in Spain as artificial intelligence used to create fake naked images of underage girls.” From the story: “Police in Spain have launched an investigation after images of young girls, altered with artificial intelligence to remove their clothing, were sent around a town in the south of the country. A group of mothers from Almendralejo, in the Extremadura region, reported that their daughters had received images of themselves in which they appeared to be naked.”  

One girl was blackmailed by a boy with a doctored image of herself. Another cried to her mother: “What have they done to me?” 

From the Washington Post: “AI fake nudes are booming. It’s ruining real teens’ lives.” From the story: “Artificial intelligence is fueling an unprecedented boom this year in fake pornographic images and videos. It’s enabled by a rise in cheap and easy-to-use AI tools that can ‘undress’ people in photographs – analyzing what their naked bodies would look like and imposing it into an image – or seamlessly swap a face into a pornographic video.”

Those are just a few examples of dozens of stories from the past few months. The pornography crisis is being exacerbated further by AI, once again highlighting the unfortunate truth of a joke in tech circles: first we create new technology, then we figure out how to watch porn on it. The porn industry has ruined an untold number of lives. AI porn is taking that to the next level. We should be prepared for it. 
