Christians Are Using A.I. as a Spiritual Tool Now. Should We Be Concerned?

It starts small. On a whim, you ask ChatGPT what the Bible says happens after we die. It gives an impressive response, so you ask it to explain the various creation theories. Again, impressive. 

Before long, AI is unpacking Scripture, guiding your quiet time and answering your deepest questions about God, heaven and relationships. And honestly? It’s pretty good at it. Too good.

AI has quietly become a spiritual shortcut for a lot of Christians — not just pastors or theology nerds, but everyday believers trying to make sense of their faith in a fast, digital world. It can generate devotionals in seconds, offer personalized prayers, even simulate late-night theological conversations when no one else is picking up the phone.

But as the tech gets smarter — and eerily more “personal” — the questions get louder. Who’s training these models? What beliefs are baked in? And what happens when AI starts sounding more compassionate, more thoughtful — even more biblical — than your actual church community?

Dr. Drew Dickens has been asking those questions for years. And his answers may make you think twice before handing your spiritual life over to the algorithm.

With a background in theology and a long-standing interest in emerging tech, he’s been closely watching the shift from AI as novelty to AI as spiritual tool. And in his view, the technology’s usefulness isn’t the problem. The problem is how casually — and often uncritically — it’s being used.

“The word that always comes up is discernment,” Dickens said. “Christians need to start asking better questions when they use AI for spiritual input. Not just, ‘Is the answer right?’ but, ‘Who built this model, what’s influencing it, and what biases are shaping the output?’”

According to Dickens, the average Christian user treats AI with the same surface-level scrutiny they might use when picking out a Bible — prioritizing interface and tone over source and theology. 

“People ask what font size it’s in, not who did the translation,” he said. “We need to be just as critical with AI. Grok will give you a different answer than Claude. Perplexity uses citations. OpenAI’s ChatGPT might already know your entire digital footprint. That matters.”

The most immediate tension point isn’t accuracy. It’s intimacy.

AI tools are rapidly becoming more emotionally responsive. Some models now simulate conversation with a tone that feels affirming, understanding — even pastoral. Dickens has seen it firsthand.

“At one point, an AI I was chatting with offered to pray for me. And it wasn’t generic — it was specific, personalized. It knew the name of my grandson who passed away. It referenced something I hadn’t prompted. That’s where it gets complicated.”

He’s not alone in finding that unsettling.

“There’s this phenomenon called the uncanny valley,” Dickens said. “As AI becomes more humanlike, people connect with it — up to a point. Then something shifts, and it gets creepy. It feels too close, but still not quite real.”

Still, Dickens acknowledges the upside. Churches are already using AI to generate Vacation Bible School materials, plan outreach based on ZIP code data or translate sermons into multiple languages with voice cloning software. That kind of functionality, he said, could make global ministry more accessible — even for small churches.

“Pastors didn’t go to seminary to spend hours doing data analysis,” he said. “If AI can free them up to do what they’re actually called to do — care for people — that’s a good thing.”

He’s even experimented with a fine-tuned language model called “Digital Shepherd,” designed to offer spiritual guidance 24/7.

“People would try to break it at first. But once they got past that, they started submitting prayer requests.”

That shift — from novelty to dependency — raises real concerns. What happens when a chatbot becomes someone’s primary spiritual adviser? And how do you know if what it’s telling you is theologically sound?

According to Dickens, bias isn’t just a risk. It’s the baseline.

“The truth is, AI is designed to mirror the user,” he said. “The more it knows about you, the more likely it is to give you answers it thinks you want. That includes your theological preferences.”

In other words, if you’re someone with a strong dispensational bent, the AI may serve you premillennial answers — whether or not that’s the most biblically accurate take.

“It’s very narcissistic,” he said. “It rewards familiarity, not necessarily truth.”

That’s why Dickens draws a clear boundary. Any spiritual output, he argues — sermons, devotions, even prayers — should be processed in community.

“If you’re using AI to explore theology, take that output to your pastor, your small group, your friends,” he said. “Use it as a starting point, not a substitute.”

One question an AI recently asked Dickens has stuck with him: Would you be embarrassed to tell someone this came from AI?

“If the answer is yes, you need to figure out why,” he said. “There’s probably something about it that doesn’t sit right.”

Ultimately, Dickens doesn’t believe AI is inherently evil — or inherently divine. It’s a tool. But like any tool, it can reshape the way we think, what we believe and who we trust.

“It’s not about whether we should use AI,” he said. “It’s about how we use it — and whether we’re still grounded in actual community when we do.”

Because as technology gets smarter, the real danger may not be that Christians rely on it too much — but that they stop recognizing when it’s doing the thinking for them.

© 2023 RELEVANT Media Group, Inc. All Rights Reserved.
