Artificial intelligence has given humanity a new, near-divine power. With Elon Musk’s AI tool Grok generating sexualised images of real people, the real question, Stephen Driscoll says, is not what the technology could do, but what it reveals about our sinful nature when law and restraint fall behind

Technology has just given humanity a new, near-divine power.
Before I tell you anything about that power — what it is, how we invented it — how do you feel about the thought of human beings gaining new powers of a near-divine nature?
A lot hinges on your view of humanity. If we were all-good, these new powers would be alright. We aren’t — so you probably read that sentence with a touch of anxiety. Our view of technology is a product of our view of human nature.
It was all over the internet this week. Grok, Elon Musk’s AI product, was producing sexually explicit images of real people. That’s the power. A user types (or speaks) a phrase, uploads an image, and Grok delivers the output. You take an innocent photo and redress the person in something sexual. Most notably, the tool was being used to generate images of real people in bikinis.
A new, near-divine power
The power is near-divine in the way it mirrors godly creation. God creates by language as he speaks something into reality. We call it ex nihilo creation, because it’s creation from nothing. AI is the closest I think we’ve come to ex nihilo creation. The relationship between inputs (a bit of language) and outputs (a photorealistic video of a real person) can be extraordinary. We create worlds with words.
But, back to that first question, what do you expect humanity to do with this new power?
Over the last few weeks, we had a viral deepfake sexual-abuse scandal: thousands of images per hour of real people, altered into a sexual context.
It’s worth saying that Musk’s xAI didn’t invent this new, near-divine power. Sexual deepfakes have been around for a few years, and what Grok was allowing is quite conservative compared to what can be found on less reputable websites. An Australian study from a few years ago found that 98 per cent of all deepfake videos online are non-consensual pornography, and that 99 per cent of the victims are women (Australian eSafety report).
If sexual deepfakes are a few years old, what changed in January 2026? The big change was that Grok made it all easier. Grok made it free and publicly available, on a reputable website (X, formerly Twitter), through a service already used by hundreds of millions. It also gave a bit of social cover: so many people were doing it that you could frame it as a sort of joke. We had a technological innovation in sexual abuse. But other technologies have been used for the same purpose.
Technological innovation in sin
Habakkuk 2:15 reads: “Woe to him who makes his neighbors drink — you pour out your wrath and make them drunk, in order to gaze at their nakedness!”
Drink was the technology that sinful people used to make neighbours naked. It was slow, expensive, and unreliable. AI produces the same result, but faster, cheaper and more privately.
Technology, sin and law are locked in a multi-millennia battle. Sin is like bubbling lava beneath the surface, always searching for an opening. Law, and its cousin social disapproval, try to bottle the sin up, keeping it underground. Occasionally technology makes an opening. It has happened with so many technologies: the radio, the phone, the internet, social media, AI; each was swiftly turned to sexual evil.
Unfortunately, the methods by which we catch child-abuse content rely on registries of known images and videos. AI lets people create new images with no registry match. We will need innovation in enforcement to catch the innovation in sin.
But the interactions are even more complex in the age of AI. AI doesn’t just enable human sin, it internalises it, mirrors it, stores it, groks it (the term grok describes the deep inner understanding a neural network gains when it successfully grasps some truth about the world).
Artificial sin
AI is trained on the fallen internet. Hence, the force of law may convince xAI to restrict these kinds of behaviours, but the issue of sexuality bubbles up from deep in the parameters of our new neural networks. Models like these train by viewing billions of images and videos from the internet and other sources. As I mentioned in my previous article, they are trained to mirror what they see. In so doing, they produce what I call ‘artificial sin’.
Even users trying not to get sexual images sometimes do. And cunning users, prompting AI in very specific ways, can unlock the sexual sin in the models.
The models know that when humans tag a photo with words like ‘beach’ or ‘summer’, sexuality is often present. They’ve seen more of Instagram than any of us. The models, in their training, waded through and internalised the immense human preference for lust. Lust has always been at the core of the internet.
Pornography is the paramount early adopter. It’s the business model that connected continents, funded satellites, gave us the technology we take for granted. And then to train our models, we send them out onto that same internet.
Our models are sexually scandalised; they’ve seen it all. No attempt to scrub the internet can make it clean enough. Our models carry, in their weights, every sexually awful thing a person can do. We try to suppress that knowledge.
Reinforcement learning from human feedback is a process where companies pay people to interact with their models, upvoting or downvoting responses so that the model can be optimised towards more appropriate output. But the sexually sinful knowledge is still there in the model, just driven a bit deeper underground.
AI-generated sexual content is particularly awful, and seems to be driven not just by lust, but by humiliation. The user wants to take a real person and control their likeness. To take that likeness and use it as they wish. It’s the opposite of Gnosticism: stealing a person’s body without interest in their soul. It’s rightly described as sexual assault, but an assault that can be performed without proximity.
Even people who aren’t directly targeted will bear the consequences of this new power. Previously you had to worry about private photos leaking; now people need to worry about innocent photos of themselves or their kids being used in sinful ways. The innocent era of social media photography, of parents proudly sharing, may be coming to an end. Sin forces us to be defensive when we want to be joyful.
But you might be wondering, what did Musk’s xAI do in response to the backlash?
First, they restricted the feature to premium users. They made it so that you have to pay money to generate sexual images of real people. The revenue on X increased by 18 per cent that day. They responded to an outbreak of awful, humiliating sexual abuse, by monetising it.
Then, after global backlash, bans, and a government-led investigation, Elon Musk’s xAI announced in a statement shared on X that Grok would no longer be able to edit “images of real people into revealing clothing such as bikinis, underwear and other revealing attire”.
What does AI teach us about human nature? What does it reveal about our character? Technology, for a moment, creates an opening not covered by law or social pressure, and the results are appalling. All manner of evil shoots out. That’s us, that’s what people are like. If not lust, then pride. If not pride, then gossip.
Perhaps AI will do away with the ridiculous idea, repeated forever, that: “Man naturally is benevolent, generous, healthy-souled, but in this age is corrupted by institutions.”
The Bible says: “The heart is deceitful above all things… who can understand it?” (Jeremiah 17:9)