God in the machine? People use chatbots as spiritual advisers.

Can artificial intelligence pray? Tech entrepreneur Yossi Tsuria wanted to find out.
He asked an AI chatbot to generate a prayer.
If Joe were praying for his son’s health, Mr. Tsuria asked in 2023, how should he pray? The machine replied: “Heavenly Father, in this difficult time, I come before you with a heavy heart.”
Why we wrote this
Could your next spiritual guide be artificial intelligence? AI offers Christians, Jews and others an alternative to priests, rabbis and other religious leaders.
But Mr. Tsuria had Orthodox Jews in mind, not Catholics. He revised his question.
Within seconds, the AI generated a new prayer: “Amen. Dear God, in this difficult time, I turn to you with a heavy heart. My beloved Perry, my child, faces [a] battle … I stand by his side, feeling the weight of worry and fear.”
It was a sign that, even in their early days, AI chatbots had captured the language of at least some major religions. Since then, people have begun turning to chatbots as therapists, spiritual advisers, and even companions.
Last fall, a Catholic church in Switzerland temporarily installed an AI Jesus – an AI-powered hologram – in a confessional. Religious leaders have delivered sermons written by AI. And a range of apps has been developed offering AI-generated guidance for prayer and meditation.
Yet many researchers and religious leaders are skeptical about the depth and accuracy of a chatbot’s religious counsel, and they see limits to the technology, known as generative AI. For one thing, many chatbots do not handle complex moral or religious questions very well. They also tend to stereotype non-Western religious traditions.
But the advances have opened up seemingly endless possibilities for exploring and practicing one’s faith. With them come questions about what it means, spiritually and morally, to rely on a machine that has no consciousness.
“I’m excited in some respects about this,” says David Brenner, board chair of AI and Faith, an organization that convenes discussions on AI, religion, and ethics. “But I really believe we have to be careful about how we apply it, and how we keep bringing our human understanding to how we best interact with this technology.”
Many of the questions AI developers are weighing are also explored in faith, he says: “Who are we relative to animals in our creation? What is the meaning and purpose of life? How do you preserve truth and justice in life? How do you preserve agency?”
A number of researchers say that, used correctly, large language models (LLMs) such as ChatGPT can be tools for engaging with those questions and prompting deeper spiritual reflection. Models trained on the Torah, for example, can synthesize what the Jewish sacred text says about forgiveness. But people are divided over how insightful LLMs’ responses really are, as well as over their appropriate uses in a religious context.
“The God I believe in is one who embodies truth and understanding,” says Johnny Flynn, who just finished a degree in religious studies and philosophy at the University of North Carolina at Charlotte. “If I’m ever trying to engage with the spiritual … then I would want to go to a source that also has understanding and can grasp truth.”
Although chatbots can deliver words of empathy and emotion, they do not feel, or even know, what emotions are. To be angry, for example, is “to hold the belief that you have been wronged unjustly,” says Alba Curry, a lecturer in philosophy at the University of Leeds in the United Kingdom.
An LLM cannot make that kind of judgment. Instead, it relies on prediction. Trained on more written words than a person could read in a lifetime, chatbots use advanced technology to guess, with great accuracy, which word comes next in a sentence, crafting responses that mimic conversation. That makes them sound human and easy to talk to.
Metaphysics and AI
That can be a problem for someone seeking real spiritual guidance. “Large language models right now are sycophants. They really want to give you what you want,” says Dr. Curry. That is not the same as the “strength and grit” a priest or a rabbi might offer someone wrestling with a question of religious duty, for example.
Nor are the models good at listening for emotional vulnerabilities. And people who have worked as AI researchers and served as spiritual advisers say the two are not interchangeable.
Marcus Schwarting, an AI researcher pursuing his doctorate at the University of Chicago, is a Stephen Minister, a Christian layperson trained as a caregiver. In that role, Mr. Schwarting met weekly with a person seeking support. It gave him a way to compare how those conversations might have gone with a chatbot rather than with him.
“I don’t really think an AI model is capable of having that sense of presence,” he says.
Still, he doesn’t rule out every way AI could be a useful tool for spiritual exploration. If someone talks things through with an AI, that person is doing 90% of the work, he says. “I don’t really think there’s anything metaphysical about the AI model, but there could be something metaphysical happening with you.”
Other researchers say chatbots could help someone think through how to confess a sin to a pastor or other religious authority, or could provide companionship while reading a sacred text.
“The Christian community could begin to realize that these are the kinds of uses that would be helpful for our community,” says Dr. Curry of the University of Leeds. But “we will never use large language models for these really deep moral debates.”
So far, the vast majority of the material chatbots are trained on is Western, which leads to bias against religions from other parts of the world, says Flor Plaza, a professor of computer science at Leiden University in the Netherlands. LLMs show nuance when discussing the major religions of the United States and Europe. But in one study, she and others found that Eastern religions, such as Hinduism and Buddhism, are heavily stereotyped, and that Judaism and Islam are stigmatized.
Chatbots are generally tuned to encourage positive things, such as respect for different denominations, and to steer away from certain ideas, such as religious violence or self-harm, including suicide. (Many companies, however, have yet to develop reliable safeguards against machines suggesting the latter.) Those values are set by the companies that develop and train AI – which means their workers will influence how religion is represented.
That is one more reason to bring religious leaders into the conversation, says Elias Kruger, a scientist who launched a blog called AI Theology in 2016. There is potential in using theological thinking to explore AI from an ethical perspective, he says.
“Ethics has to do with our relationships not only with each other as human beings, but with our whole universe,” he says. “We used to treat machines and things built by humans as things, and now we’re going to treat them as beings.”
That shift could pose problems when it comes to staying faithful to the values of particular faith traditions. There is a risk of what many Abrahamic traditions would call idolatry, because AI appears to share certain attributes with various faiths’ conceptions of God, such as omniscience, omnipotence, and omnipresence, says Mr. Brenner of AI and Faith. But it lacks others, such as love, concern, care, truth, and other qualities that make up “the full dimension of God.”
AI will change only if the people who engineer it change, says Mr. Kruger. Most models are developed and owned by a Silicon Valley workforce that skews male. “How can we start to address the problem by enabling people of many faiths and varied backgrounds to become builders?” says Mr. Kruger, who holds a master’s degree in theology from Fuller Seminary. “I think that’s really what will change the arc of AI development.”
The purpose of religion
Many religious groups are concerned about the risks of assuming AI is omniscient. “It is built to give us an answer, whether that answer is true or not,” says Meredith Gardner, media director for Mormon Women for Ethical Government. The group signed a recent letter asking Congress to reject a proposed moratorium on AI regulation.
AI can play the role of a spiritual director by asking questions and offering prompts, says the Rev. David Kim, CEO of Goldenwood, who has developed a bot and runs workshops with faith groups interested in exploring AI tools. For him, it comes back to an idea of “hopeful intelligence.” Imagination has always been a key part of his own faith journey, he says, and he sees AI as a creative tool.
“We’re certainly aware of everything that can go wrong with it, but given our theological commitments, we have this mandate to move forward with things we cultivate in a very hopeful orientation,” he says.
Even if AI has no consciousness, Mr. Kruger says he has no doubt that people can use it to explore their faith. But it is important to keep a sense of perspective and not seek spiritual guidance from ChatGPT alone, he says.
“Religion has to be about this: Does it bring us closer together, or does it leave us more isolated?”



