Understanding how something works requires more than personal experience. It requires 2,500 years of serious reflection.
How does one determine what to be concerned about in a complex age seemingly dominated by all things Artificial Intelligence? The “truth” is hard to discern for many when it comes to AI. Understanding what is real has become muddied almost beyond clarification. Mediating the good, the bad, and the ugly of algorithmic issues is the paramount intellectual and practical issue of the day. How, and to what degree, one should engage these emerging digital technologies is a legitimate question.
This is why I was puzzled to read a recent piece in Religion and Liberty Online that, while trying to sound nuanced, comes off as a polemic against a perceived Luddism among many fellow Christians regarding AI.
The question at the center of Isaac Willour’s essay is deceptively simple: Who gets to judge a tool? His answer is that only people who use the tool can give advice about it. He ends his article this way:
When it comes to the responsible use of tools, take advice from the people who use them, not the people who think not using tools is an inherent indicator of virtue. They’re different presuppositions, and choosing the former is how you rebuild a society that knows the nature of a tool by its telos. And that includes a tool like AI. So let’s take our analysis from people who are adept at using the tool, not those who aren’t because they’re already convinced they’re against it—whatever it is …
There is something worth taking seriously in this concern, because rhetorical propaganda does dominate both sides of this debate, and reflexive technophobia is its own failure of discernment. To crib off Aristotle, practical wisdom lives between the two extremes, and striving to discern the truth of the matter should be important to all thinking people, especially Christians. But Willour’s remedy—trust the tool-users—is precisely where a 2,500-year tradition of serious thought pushes back, and it does so from the very beginning.
Plato, in the Phaedrus, quotes a mythical Egyptian king as saying the inventor or creator of a thing is blinded by paternal affection for it, seeing only its promise and not its dangers, and that the judgment of whether an invention is beneficial or harmful belongs not to its creator but to others. Plato also famously argued in the same work that the invention of writing would harm memory and recollection. The Romans in turn wrestled with what the ordered day would do to their culture when the sundial and other timekeeping devices spread through the empire from 200 to 100 BCE. The instinct to question what a new tool does to the people who use it is as old as recorded thought itself.
No serious person should distill the streams of technological critique down to simply a question of how one uses a given tool, because tools are never neutral in the way that framing implies. As Martin Heidegger explains in his classic essay on this topic, technology is a way of revealing the world, of ordering and framing reality, not merely a collection of instruments. Neil Postman sharpened this insight further when he wrote that embedded in every tool is an ideological bias, a predisposition to construct the world as one thing rather than another. This is not a fringe position but the bedrock of serious philosophical engagement with the question.
That engagement has a deep and varied Christian tradition behind it, one that Willour’s piece curiously flattens. Catholics like Walter Ong, Marshall McLuhan, Romano Guardini, and Albert Borgmann; Protestants like George Grant, Jacques Ellul, C.S. Lewis, and Egbert Schuurman; and religiously adjacent thinkers like Neil Postman and Harold Innis have all contributed to a rich conversation about how man uses and is used by his technologies, and none of them were Luddites or people who refused to engage cultural and social uses of technology. They were serious, historically informed thinkers doing exactly what Willour dismisses: critiquing tools they understood deeply from a position of philosophical reflection, not fear or obscurantism.
Having researched and taught this topic over the past 20 years, I have found that one of the most consistent threads running through the history of technology is Roy Amara’s deceptively simple dictum that we tend to overestimate the effect of a technology in the short run and underestimate it in the long run. This is as true now as it has ever been, and Christian caution about emerging technologies is not fear dressed up as virtue so much as it is, historically speaking, a reliable instinct about how similar issues have unfolded in the past.
Social commentator Derek Thompson was correct when he quoted William Goldman’s now-famous insight about the movie business that “nobody knows anything” and applied it to attempts to predict where any of this AI stuff is headed. That honest uncertainty should give everyone pause, enthusiasts very much included. Willour insists that he is interested in rebuilding society, and that is an admirable goal, but it is one that ought to be undertaken with the full weight of the historical and theological frameworks that have spent centuries laying the groundwork for exactly this kind of critique.
As Jesus famously states in the Gospel of John: “And ye shall know the truth, and the truth shall make you free.” Striving to discern the truth about our built environments should be important to all thinking people, especially Christians, and that pursuit does not require you to be a daily user of the thing you are examining. Sometimes it requires the opposite: the distance to see what the inventor, blinded by paternal affection, simply cannot.