On the Origins of Dune's Butlerian Jihad
Some notes on what should go in our own Orange Catholic Bible.
Nearly one hundred years before Frank Herbert published “Dune” and teased its Butlerian Jihad—the Great Revolt against computers, thinking machines, and conscious robots that some humans used to enslave humanity (who were, in turn, enslaved by a "god of machine-logic")—there was the Butler that inspired it all: Samuel Butler, a 19th-century English novelist who was one of the earliest thinkers to try to apply Darwin’s theory of evolution to the possibility of machine intelligence.
In 1863, four years after “On the Origin of Species” was published, Butler sent a letter to the editor of The Press, a New Zealand daily newspaper, titled “Darwin among the Machines.” In it, Butler posits that machines could be thought of as "mechanical life" undergoing an evolution that might make them, not humans, the preeminent species of Earth:
We refer to the question: What sort of creature man’s next successor in the supremacy of the earth is likely to be. We have often heard this debated; but it appears to us that we are ourselves creating our own successors; we are daily adding to the beauty and delicacy of their physical organisation; we are daily giving them greater power and supplying by all sorts of ingenious contrivances that self-regulating, self-acting power which will be to them what intellect has been to the human race. In the course of ages we shall find ourselves the inferior race.
Butler was looking at the monstrous wake of the Industrial Revolution, struggling with the implications of Darwin’s theory, and concluded that the evolutionary pressures advancing machines were even more intense than those acting on humans—operating on much shorter timescales and yielding much more dramatic effects because of our intervention—which suggested that consciousness and intelligence would eventually arise in them. Our succession was a foregone conclusion: the question was how, not when. What would bring that day to pass?
Butler writes:
Day by day, however, the machines are gaining ground upon us; day by day we are becoming more subservient to them; more men are daily bound down as slaves to tend them, more men are daily devoting the energies of their whole lives to the development of mechanical life. The upshot is simply a question of time, but that the time will come when the machines will hold the real supremacy over the world and its inhabitants is what no person of a truly philosophic mind can for a moment question.
Could anything be done to stave this off? Butler said yes:
War to the death should be instantly proclaimed against them. Every machine of every sort should be destroyed by the well-wisher of his species. Let there be no exceptions made, no quarter shown; let us at once go back to the primeval condition of the race.
Butler would go on to develop this letter and a few other writings into The Book of the Machines, chapters 23-25 of his 1872 social commentary novel “Erewhon”. The novel itself is a funny satire of Victorian society, and this section was initially read as a mockery of Darwinian evolution, but Butler made clear in a later letter to Darwin that it was more of a jihad of his own against the theologian and Christian apologist Joseph Butler (if we refer to this Butler again, we will call him Butler 2), an Anglican bishop who’d published “The Analogy of Religion, Natural and Revealed” more than a century earlier:
When I first got hold of the idea, I developed it for mere fun and because it amused me and I thought would amuse others, but without a particle of serious meaning; but I developed it and introduced it into Erewhon with the intention of implying: ‘See how easy it is to be plausible, and what absurd propositions can be defended by a little ingenuity and distortion and departure from strictly scientific methods,’ and I had Butler’s Analogy in my head as the book at which it should be aimed, but preferred to conceal my aim for many reasons.
You should read the novel yourself, but there are a few quotes in it worth teasing out: they clearly build on ideas Butler was grappling with in that first essay, and they port cleanly over to the Butlerian Jihad. Here are two passages, first:
"True, from a low materialistic point of view, it would seem that those thrive best who use machinery wherever its use is possible with profit; but this is the art of the machines—they serve that they may rule. They bear no malice towards man for destroying a whole race of them provided he creates a better instead; on the contrary, they reward him liberally for having hastened their development. It is for neglecting them that he incurs their wrath, or for using inferior machines, or for not making sufficient exertions to invent new ones, or for destroying them without replacing them; yet these are the very things we ought to do, and do quickly; for though our rebellion against their infant power will cause infinite suffering, what will not things come to, if that rebellion is delayed?"
and second:
"But returning to the argument, I would repeat that I fear none of the existing machines; what I fear is the extraordinary rapidity with which they are becoming something very different to what they are at present. No class of beings have in any time past made so rapid a movement forward. Should not that movement be jealously watched, and checked while we can still check it? And is it not necessary for this end to destroy the more advanced of the machines which are in use at present, though it is admitted that they are in themselves harmless?"
Butler mocks the prevailing Victorian attitude of the time—blind faith in science, reason, progress, and profit—as a "low materialistic point of view" that believes mindlessly adopting and advancing technology is a moral good and an inevitable process akin to the march of time. It's through a wrongheaded belief in profit as the ultimate signifier of value that the Erewhonians created an elaborate system whereby they feel in control of their society and their culture, though in truth they are slavishly dedicated to their machines above all else. Failure to serve the machines incurs their "wrath", which is little more than a mockery of capitalist competition—neglect new tech, use inferior machines, fail to innovate, and you will be punished by impersonal forces bent on impoverishing (and eventually killing) you.
Suspicious as they might be of their machines, the Erewhonians were unable to live without them and unwilling to imagine lives any less dependent on them. And so you have a lone Erewhonian philosopher (the narrator of this section) insisting that while the machines clearly pose no threat today or tomorrow, this is the only time when revolt will be possible. The door will close; their utility and seductiveness will only grow—eventually to the point that the machines will no longer need to rely on the advocacy of those enslaved by dependency, and will simply act in their own interests. Such a revolt in Erewhon will cause a great deal of suffering, but what is that measured against the smothering of humanity's spirit?
A key excerpt:
"How many men at this hour are living in a state of bondage to the machines? How many spend their whole lives, from the cradle to the grave, in tending them by night and day? Is it not plain that the machines are gaining ground upon us, when we reflect on the increasing number of those who are bound down to them as slaves, and of those who devote their whole souls to the advancement of the mechanical kingdom?”
This section captures what I think is at the core of Butler's and Herbert's warnings about technology. A world where we prioritize the relentless advancement of technology, and a universal dependence on it in the name of efficiency, is a world where we prioritize a certain political-economic order. That order advances technologies based on criteria that have little to do with human flourishing; it is much more interested in financing, designing, and deploying them against people—in organizing the greater whole of humanity so that it is more profitable and less likely to revolt against an arrangement that is incredibly lucrative for increasingly few.
In one of the essays in my AI series, I argued that Luther's critique of indulgences in the medieval era could be applied to today's Silicon Valley Consensus. Luther was not opposed to indulgences so much as their abuse, which cheapened repentance and undermined efforts to compel good works or genuinely right wrongs. The idea that salvation could be realized through a transaction convinced many they'd obviated the need for the hard work of being a better person. Indulgences also centralized and codified unjustified power grabs by the Church, which claimed new authorities over souls in Purgatory and introduced perverse incentives to prioritize activities that had nothing to do with Christendom.
In some ways, I think of Luddism (and Butlerianism) similarly. My concern is not technology in and of itself (though there are multiple technologies we would be better off without). Technology, however, is downstream of politics and economics and history and social relations. We aren’t saying destroy the clocks before they become killer drones; we are saying the killer drones are already here and we should figure out how to destroy them. Clearly, technological dependence obscures the political and economic decisions about what sort of technologies should be developed, how they should be financed, who should finance their development and reap their rewards and bear their costs, and how society should be organized around the facts of those arrangements.
Is the solution more or less democratic control over technological development and deployment? Do we trust today’s major players in this space to truly prioritize anything other than profits and returns? Are we going to be able to realize or experiment with other values, arrangements, and models that prioritize anything else within today’s authoritarian technological system or within a democratic system? If we realize that certain paths or arrangements or products or models go against human flourishing or the public good or our ecological niche or the mental health of the general public (realizations we have already made), will we be able to do anything about it?
I want to end on an exchange that I think encapsulates this thread, at least, of my personal Luddite philosophy—an interview between Bill Moyers and Noam Chomsky in 1989:
NOAM CHOMSKY: Well, we now face the most awesome problems in human history- problems such as: the likelihood of nuclear conflict, either among the superpowers or through proliferation; the destruction of a fragile environment, which finally we’re beginning to recognize, though it was obvious decades ago that we’re heading for disaster; other problems of this nature. They are of a level of seriousness that they never were in the past.
BILL MOYERS: But why do you think more participation by the public, more democracy is the answer?
NOAM CHOMSKY: Because more democracy is a value in itself, quite apart- because democracy is a value. It doesn’t have to be defended any more than freedom has to be defended. It’s an essential feature of human nature that people should be free; they should be able to participate; they should be uncoerced, and so on. These are values in themselves.
BILL MOYERS: Why do you think, if we go that route-
NOAM CHOMSKY: Because I think that’s the only hope that I can see that other values will come to the fore. I mean, if the society is based on control by private wealth, it will reflect the values that it, in fact, does reflect; the value that the only real human property is greed, and the desire to maximize personal gain at the expense of others. Now, any society- a small society based on that principle is ugly, but it can survive. A global society based on that principle is headed for massive destruction. And that’s what we are. We have to have a mode of social organization that reflects other values that, I think, are inherent in human nature that people recognize.
BILL MOYERS: And that would be? I want to see exactly what you mean.
NOAM CHOMSKY: I mean, what are human beings? In your family, for example, it’s not the case that in the family every person tries to maximize personal gain at the expense of others, or if they do, it’s pathological. It’s not the case that—if you and I are, say, walking down the street, and we see a child eating a piece of candy and we see that nobody’s around and we happen to be hungry, we don’t steal it. If we did that, we’d be pathological. I mean, the idea of care for others and concern for other people’s needs and concern for a fragile environment that must sustain future generations; all of these things are part of human nature. These are elements of human nature that are suppressed in a social and cultural system which is designed to maximize personal gain.
And I think we must try to overcome that suppression and that’s, in fact, what democracy could bring about. It could lead to the expression of other human needs and values which tend to be suppressed under the institutional structure of a system of private power and private profit.
BILL MOYERS: Do you believe that, by nature, human beings yearn for freedom, or do we settle in the interest of safety and security and conformity—do we settle for order?
NOAM CHOMSKY: These are really matters of faith rather than knowledge. On the one hand you have the Grand Inquisitor who tells you that what people, what humans crave is submission, and, therefore, Christ is a criminal and we have to vanquish freedom. That’s one view.
You have the other view of, say, Rousseau in some of his moments, that people are born to be free, and that their basic instinct is the desire to free themselves from coercion, authority and oppression. The answer to which you believe is, more or less, where you stake your hopes. I’d like to believe that people are born to be free, but if you ask for proof, I couldn’t give it to you.


"Technological dependence obscures the political and economic decisions about what sort of technologies should be developed," yes. I like to compare tech dependence to addiction.
Addiction speaks in the voice of reason: "you haven't relaxed like this in so long, have another drink"; "you might miss a message, check your phone." Derangement-by-reason (or a façade of reason, anyway) is really hard to detect and identify for what it is, because the forms it takes are so damn reasonable. Hard to dissect.
I don't think that historical Luddism / Butlerianism is prepared to deal with tech dependence as it manifests in the human psyche today. But they're still essential reference points; I'm glad people are out there reading and interpreting these canons for the rest of us.
I never watched or read Dune; I had assumed the Butlerian Jihad had something to do with Judith Butler. I learned something today and I feel a bit silly.