Last year, Samuel D. James – author of an essential book on the internet age titled Digital Liturgies – warned that “Christians are not ready for the age of ‘Adult AI.’”
In his essay, James made the case that the rise of AI would herald a shift in the porn industry comparable to the shift from print magazines and video cassettes to near-omnipresent internet pornography. This, James warned, would render moot many of the most potent anti-porn arguments made by Christians, such as the harm pornography inflicts on those directly involved in producing it.
“For users, AI simply represents a limitless scope of what you can make digital performers do,” James told me. “Before the web, the vast majority of pornography fit neatly into either the ‘softcore’ or ‘hardcore’ varieties. The web pretty much destroyed that distinction by offering users whatever they want at one location, and now when we say that the average 10-year-old has already been exposed to pornography, we don’t mean Playboy; we mean things much more intense and explicit. AI porn will, I think, likewise shift the window, so that the baseline of content becomes much wilder, much more exotic.”
I think James is right, although my involvement in the anti-porn world over the past decade leaves me unconvinced that the primary arguments now being used against porn will be rendered ineffective in the age of AI. The argument that pornography destroys porn stars has long since been replaced by the argument that pornography rewires the minds of its users – and that millions of those users are now children. At major anti-porn conferences, few are making the argument that pornography must be addressed because porn performers are harmed (although they are).
But it is indisputably true that the rise of AI will hugely exacerbate the porn crisis. Girls are already being viciously bullied with AI porn apps that allow boys to create deepfake pornography of their peers. As AI meets revenge porn, this has already resulted in a string of suicides. Apps that can digitally “undress” people have been used against high school girls with traumatizing effects, and authorities – as usual – have been caught completely off-guard by the latest pornographic technological development. Just this month, Michigan passed two laws banning AI-generated pornography.
But in many cases, the law has not yet caught up with technology, as highlighted by a Fox News report on September 1. An Indianapolis judge was forced to find an “AI child pornography defendant” not guilty due to the uncertainty of the law surrounding AI porn. An unnamed man was charged with ten counts of child exploitation and child pornography for “superimposing actual headshot photographs of juvenile girls onto … AI-created images of nude women,” but to the clear frustration of the court, the law in this regard is still a grey area.
“Your attorney summarized nicely what I’m sure a lot of people think about what you’ve done,” the judge told the defendant. “It should be against the law, and I imagine at some point it will be, but I don’t think it is yet, at least not what you did in this particular instance. I don’t buy into the idea that you can pin somebody else’s body onto a child and say that’s not them. The child doesn’t have to exist for the statute to apply.”
“When the allegation is that you are creating pornography or you are creating child pornography where you’re taking images or AI images of kids … those individuals are still dangerous to our community,” Marion County Prosecutor Ryan Mears told FOX. “Our position certainly, when you have real live identifiable children, the position of the Marion County Prosecutors’ Office is that it’s absolutely something that should be and will be prosecuted.”
State Senator Aaron Freeman, the Republican chairman of the Senate Corrections and Criminal Law Committee, stated that he hopes the ruling will be overturned, and that “lawmakers should reexamine the statute as it applies to AI-generated child pornography.”
“Indiana has updated our law and accommodates the law for these new and AI images,” he said. “The law does not and should not distinguish between a live human and an AI-generated human. The point is, this is the image of some child, whoever that child is, with sexual contact occurring to a child. We will work to get to a solution there, whether it’s through the Indiana Attorney General, whether that’s through the court system, the appellate process, or whether that’s us tweaking the law next session, this will be done and we’ll figure it out and we’ll get to the bottom of it and hold somebody accountable.”
Two things are true here: lawmakers have much to do to catch up with the rise of AI, and in the absence of broader legislation regulating AI, that will remain the case for years to come. With that said, I do not think that the rise of AI pornography will complicate the fundamental premise of the anti-porn argument now being made, which is that pornography is rewiring the upcoming generation with deeply horrifying effects. In fact, the rise of AI pornography is, unfortunately, likely to strengthen rather than weaken that argument as the harm becomes more insidious and deepfake porn becomes ubiquitous.