- Ryan Panzer

- Apr 22
The anxiety beneath the AI boom is palpable. High performers (and high earners) in fields like software development, data analytics, and cybersecurity are reading about rapid gains in AI capability and asking whether their job will be next. Some have responded with existential dread. Others have sought to control the situation by adding more technical skills, trying to chart a learning curve ahead of the LLMs. But what if the real risk in the AI boom isn’t falling behind? What if the real risk is becoming too replicable, too easy to re-create?

Three and a half years into the AI explosion, it’s increasingly clear that AI doesn’t eliminate all, or even very many, jobs. Rather, AI isolates and exploits the aspects of human labor that are easily “programmable.” Enter the theologian as the AI-proof professional, and theology as a quintessentially irreplaceable skill.
Prior to 2022, career security and affluence almost necessitated learning scarce, complex technical skills like coding. In the age of AI, technical skills are increasingly automatable. See it for yourself: in under an hour, you can create an app of your choice with tools like Claude or Lovable. What once took years now takes seconds. When bots become technical, technical skills are no longer scarce. The rarefied skills of the AI age, the ones that are truly impossible to automate, are judgment and meaning-making. These skills might not be trained in Silicon Valley, but they remain the bedrock of theology.

As Cade Metz recently wrote in the New York Times, AI is exceptionally strong in narrow, highly structured domains. It possesses remarkable, yet “jagged,” intelligence. It is surprisingly weak, however, in situations that are ambiguous, where the context shifts, where moral reasoning is necessary. Can ChatGPT solve complex math? Yes. Can it navigate real-world decisions that one might characterize as “judgment calls”? Ask your chatbot to solve your next workplace dispute or standoff. I’ll wager you a bitcoin it won’t help one bit (or byte).
Within this AI economy, jobs aren’t replaced wholesale. They are fragmented. Every role becomes a mix of automatable tasks, completed alongside a chatbot, and remarkably human responsibilities. The question is no longer “Will AI take my job?” but “Which parts of my work are protected from AI’s jagged edges?”
Thus, as AI handles structured tasks, economic and vocational value concentrates in areas with low feedback, high ambiguity, and true human consequence. It becomes crucial to discern when AI is wrong, to interpret its outputs critically, and to take accountability for the outcomes. AI can generate answers. We still have to choose which course to take.
The time has come for the theologian. Theology, a discipline of wrestling with the sacred from a very human vantage point, involves tasks that no LLM can replicate. To theologize is to read complex and ancient texts across time and context, to accept that they have multiple meanings, and to construct a message or narrative that is resonant and relevant.
To do the work of the theologian is to navigate ambiguous situations, responding with an articulation of what is faithful, reasonable, and conscientious. The chatbot follows scripted rules. The theologian forms a coherent viewpoint that informs leadership decisions, ethical stances, and organizational culture.
This is not to say that all programmers should become pastors or that all data scientists should study divinity. But it might be helpful for those fearing the jagged edges of AI to recognize how the discipline of theology, of interpreting meaning, is more applicable than ever. Presented with ambiguity, how might we account for the influence of tradition? How might we draw upon that which is authoritative? Where should we look for meaning, for purpose? We might not always bring up God when we bring up Google, but surely the theological task has newfound relevance for those in sales, marketing, product management, and many more “anthropological” fields.
Still, were all programmers to become more pastoral, we might be in a better place societally. To think theologically is to consider what it means to be a minister. As Bonhoeffer would suggest, theology is inseparable from ministry. So our argument would be incomplete if we recommended that the AI-at-risk in our society learn only from the heady side of theology. Pastoral ministry involves attending to people navigating uncertainty, grief, and conflict. To be a pastor is to listen, to attend to difficulty in a way that is relational rather than transactional. Who among us isn’t navigating uncertainty, grief, and conflict? Who among us wouldn’t appreciate the support of a non-transactional, non-anxious, trustworthy person? The tech manager may not preach on the Gospel. But the pastoral skillset is increasingly important to the managerial class.
Theology alone is seldom a career path. I’m not encouraging a generation of AI-displaced workers to enroll at divinity school (though some should give that serious consideration). Instead, the practice of theology will become a force multiplier for technical skill. The question isn’t “Does theology get you hired?” It’s “What kind of judgment shows up once you are?”
My advice to the class about to graduate is to learn a technical skill. To learn to work alongside AI, while taking a critical look at what these tools do and how they are forming us. And at the same time, to read great texts and engage great works of art that cultivate patience, focus, and empathy. To put oneself in situations where ambiguity is a given and the next step isn’t obvious. AI fluency and human formation form a powerful pair for the road ahead. Tools, paired with telos, will AI-proof your career.
The future of work will not be evenly automated. It will be uneven, unpredictable, and responsibility-heavy. That puts a premium on judgment, presence, and meaning-making. The safest careers are found where AI keeps coming up short.



