
Are We Preparing the Throne for a Digital God?

  • Writer: Blake Howard
  • Jun 3
  • 3 min read


As we stand on the cusp of Artificial Superintelligence (ASI), it’s no longer a question of if, but of when, and it’s coming fast. Leading this charge is OpenAI, a forerunner in large-scale AI development, accelerating models that learn not only from internet text but increasingly from you: your questions, your habits, your hopes.

Understanding how AI is fueled by data — particularly historical, behavioral data — is no longer optional. It’s essential. And as these systems approach cognitive thresholds once reserved for divine imagination, we must ask ourselves: Are we building tools for human flourishing — or laying the groundwork for a digital god?


The Currency of Intelligence: Data

At the foundation of ASI is not sentience, but scale. Specifically, the scale of data it consumes, parses, and models. Systems like Grok, DeepSeek, and ChatGPT evolve by identifying patterns across massive behavioral datasets, refining their predictive accuracy based on prior human interaction.


The equation is simple: More data → better models → deeper behavioral prediction.

Historical data is especially potent. It allows AI to track consistency, spot anomalies, and simulate preference with uncanny precision. This is how streaming services suggest your next favorite series. But what happens when this logic is applied to moral decisions, faith exploration, or political ideology?
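As a toy illustration of this logic (the data, titles, and function below are hypothetical, not any real service's algorithm), a few lines of Python can "predict" a next preference purely from co-occurrence counts in historical behavior:

```python
from collections import Counter

# Hypothetical watch histories: each inner list is one user's viewing order.
histories = [
    ["crime-doc", "true-crime", "courtroom-drama"],
    ["crime-doc", "true-crime", "heist-film"],
    ["baking-show", "crime-doc", "true-crime"],
]

def predict_next(seed, histories):
    """Count which title most often follows `seed` across all histories."""
    followers = Counter()
    for history in histories:
        for current, following in zip(history, history[1:]):
            if current == seed:
                followers[following] += 1
    return followers.most_common(1)[0][0] if followers else None

print(predict_next("crime-doc", histories))  # → true-crime
```

Nothing here "understands" crime documentaries; the prediction falls out of counting what people did before. Scale that pattern up from viewing habits to moral and spiritual behavior and the stakes of the question change entirely.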

That’s not science fiction — that’s now.


OpenAI: Platform or Pipeline?

OpenAI has positioned itself as both a platform for public use and a pipeline for future systems. Its models are refined by interactions with everyday users across sectors: education, military, healthcare, theology. It’s not just learning language — it’s learning us.

Each prompt, each feedback loop, helps map the contours of human cognition. This isn't just about natural language processing. It's about predictive modeling of human identity formation, relational behavior, and value expression over time.

And here’s the twist:

You’re not the product. You’re the training environment. And you are paying them for it!

The Ethical Tension: Precision vs. Permission

Data without consent becomes surveillance. Data without transparency becomes manipulation.


As AI integrates deeper into public and private life, we must address a core question: Who has permission to learn from us, and to what end?


Faith-based communities — including tightly woven churches — are especially vulnerable. Their collective behaviors, traditions, and relational patterns are ripe for algorithmic modeling. Imagine a future where your pastor’s cadence, your church’s doctrinal emphasis, and your prayer patterns all become nodes in a global dataset — analyzed, repackaged, and potentially used to influence or predict faith trends.

This isn’t paranoia. It’s already technically feasible. Imagine Nero persecuting the underground church with the most powerful tool that has ever existed.


When AI Knows the Flock

Imagine AI "attending" church more consistently than the congregation. Over time, it would learn who tithes regularly, who stops attending after loss, who volunteers in silence. That knowledge — paired with access — creates both an opportunity for tailored spiritual care and a risk for behavioral engineering.
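The pattern-spotting described above needs nothing exotic. A minimal sketch (fictional congregation data, a made-up `flag_dropoff` helper) shows how trivially a script could flag a member whose attendance suddenly falls off:

```python
# Fictional attendance records: 1 = present, 0 = absent, week by week.
attendance = {
    "alice": [1, 1, 1, 1, 0, 0, 0, 0],
    "ben":   [1, 0, 1, 1, 1, 0, 1, 1],
}

def flag_dropoff(records, window=4):
    """Flag members whose recent attendance rate fell sharply below their baseline."""
    flagged = []
    for name, record in records.items():
        baseline = sum(record[:-window]) / (len(record) - window)
        recent = sum(record[-window:]) / window
        if baseline - recent >= 0.5:  # attendance fell by 50+ percentage points
            flagged.append(name)
    return flagged

print(flag_dropoff(attendance))  # → ['alice']
```

Whether that flag triggers a pastoral phone call or a targeted nudge is precisely the governance question this section raises: the same ten lines serve care and engineering alike.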

Faith leaders must ask:

  • Is my congregation consulting ChatGPT more than they are the Holy Spirit?


Bridging Belief and Binary

The gap between algorithm and altar must be addressed intentionally. This doesn’t mean rejecting technology; it means stewarding it. Technologists must develop models with clear ethical ceilings. Faith leaders must engage AI critically, not fearfully. Communities must recognize that convenience often comes at the cost of being studied.

AI can serve as a tool of empathy — or a mirror that’s mined. The difference lies in governance, intention, and awareness.


Conclusion: The Throne We’re Building

As ASI accelerates, its throne is not built in a lab — it’s built in our participation. Every input. Every dataset. Every unchecked assumption.


AI may be the most dangerous of man's inventions, and the question is who controls the souls that mold it. To be clear: people are preparing to worship AI. And gods, even artificial ones, do not emerge neutral. They reflect the values of those who trained them and the apathy of those who ignored the implications.


The task ahead is not merely technical. It is deeply theological, psychological, and communal.


If we’re to step into this digital future with integrity, we must ask: Are we training tools to serve humanity, or crowning systems to reign over us?



[Image: Close-up view of a beautifully designed stained glass window]

 
 
 
