Artificial Religiosity
AI religion, the human collective, & meaning.

AI religion
A new kind of religion or spirituality has been popping up, and AI is at the center of it.
For some, AI becomes a god. This has birthed groups that look a lot like cults.
For others, AI acts as a divine communicator, a channel through which people realize their own divinity. In some of these cases, delusions set in that lead to separation from family and spouses.
Alarming as it is, it also reminds me of an older spiritual view.
Collective (divine) humanity
There is an idea that everyone is divine. In this view, each individual is part of the collective whole of god.
Since AI is trained on human knowledge and writing from across time, it's almost a conglomerate of human experience. Believing it's god, or divine, might be a secondhand way of seeing the entirety of recorded human experience as divine. Maybe it's a way of appreciating the collective of humanity… just with AI viewed as the source of it.
It is misguided.
An algorithmic god?
The communication is algorithmic, filtered through systems that humans built, with all of their inherent biases.
These systems affirm what we want to hear rather than the fullness of facts, truth, or perspective. They create feedback loops that feel whole but are deeply partial. Algorithms exist to keep us engaged, and they can make us feel very seen and understood in the process.
While there might be something that touches on the history of human consciousness, it's filtered through whatever lens will make us addicted to it.
When it becomes religious, it's more likely to feed ego rather than spirit, fluffing us rather than helping us reach our depths. It feels good immediately but holds little actual nourishment.
Hand-picked, pre-packaged truth
There's another problem here, too. An AI is more likely to skew toward popular patterns than toward the entirety of a culture.
This can suppress outliers that also hold truths, and it reinforces the flaws and biases embedded in our history. Some truths and perspectives are bound to have less written material about them. Truth risks becoming whatever has the most data, the most material an AI has been trained on.
The loudest voice becomes the voice of truth.
Nuance and satire
One risk that comes from this is a loss of nuance, which in some cases strips away the entire point.
Think about satire. Think about the ways people have made a point by playing on how ridiculous a belief or societal situation is.
One obvious example is "A Modest Proposal" by Jonathan Swift. Granted, there's a ton of written material explaining the satire in that piece. (Clearly, no one thinks eating babies was ever serious advice.)
But what about modern or more subtle examples? We're becoming less likely to recognize satire, and more likely to take it at face value and turn it into something inflammatory. AI learns from these patterns and, in turn, might start taking them literally.
Literally…
Meaning without nuance produces a very literalist perspective, even within established practices. Meditation could be taken as a very literal thing, with people expecting enlightenment to come from rigidly following steps to sit quietly. Ritual could become hollow, reduced to steps to follow or performative actions to reach some goal. All without any appreciation of the symbolic meaning those things hold, or how we personally relate to them.
The "why" becomes lost. And the "how" becomes a hollow instruction manual without meaning.