How much do you trust a computer? The answer to this question has probably changed somewhat since 1976, and the relevance of this book has slipped. Now computers are all around us: commonplace, invasive. So, to appreciate Weizenbaum's book fully, get in a time machine--back to a time when you needed to schedule time to be with a computer, when the computer had a cabal-like group of attendants to help you make requests of it, and when the computer had a presence--like a guru on a mountain, you came to it.
Now then. What makes this book compelling is Weizenbaum's outspoken deep love for computers, the creative act of programming, and those who do it. He very nearly slips into the first person in the chapter where he describes the “compulsive programmer,” the poor soul who is driven by the desire--the need, actually--to create artificial minds. "It is a thrill to see a...program suddenly come back to life; there is no other way to say it." [120].
Weizenbaum is the author of the famous ELIZA program, a simple, elegant English-language parser which, over the course of a brief, casual exchange, might seem to carry on an intelligent conversation with its user. Its most popular function is that of a psychoanalyst, parroting its patients’ statements back in question form. (User: "I ate my bicycle." ELIZA (after several seconds of computation, no doubt): "Why do you think you ate your bicycle?")
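The trick is mostly pattern matching and pronoun reflection. Here is a minimal sketch of the idea in Python--not Weizenbaum's actual implementation (ELIZA used script-driven decomposition and reassembly rules), and with keywords and reflections invented purely for illustration:

```python
import re

# Toy pronoun reflections (invented for illustration): "my" -> "your", etc.
REFLECTIONS = {"i": "you", "my": "your", "me": "you", "am": "are", "mine": "yours"}

def reflect(fragment: str) -> str:
    """Swap first-person words for second-person ones, word by word."""
    return " ".join(REFLECTIONS.get(word, word) for word in fragment.lower().split())

def respond(statement: str) -> str:
    """Echo a user statement back as a question, ELIZA-style."""
    cleaned = statement.strip().rstrip(".!")
    match = re.match(r"[Ii] (.+)", cleaned)
    if match:  # "I <something>" becomes "Why do you think you <something reflected>?"
        return f"Why do you think you {reflect(match.group(1))}?"
    return "Please tell me more."  # content-free fallback when no pattern matches

print(respond("I ate my bicycle."))  # -> Why do you think you ate your bicycle?
```

Every word of the output is determined by the input and a handful of fixed rules; nothing more is going on.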
This silly program, ELIZA, has given Weizenbaum first-hand experience in observing the bizarre, irrational, and psychologically complex way ordinary humans interact with machines. He tells stories of laypeople who, even with a thorough understanding of how the program works--how each word in its output is determined solely by the human input--feel that ELIZA really, somehow, cares for them. Users actually requested private sessions with the program in order to discuss more personal matters.
This is just human nature, though, our willingness to succumb to illusion. Weizenbaum writes like a magician coming clean, showing his audience the secret compartments and trapdoors. Nothing supernatural here. He spends the first three chapters explaining computers, game theory, and Turing machines using extended metaphors, guiding his reader step-by-step through the processes by which a procedure (not the physical computer, but the essence of one) can 'think' and perform complex tasks. He emphasizes over and over that each action is fully deterministic; the computer can never choose or initiate an action on its own.
Weizenbaum gradually makes clear the purpose of this meticulous tour. The layman's perception of the computer as a sort of super-human (again, this is 1976) is beginning to have seriously dangerous consequences. Like the secretary who requested a private audience with ELIZA, government officials, psychologists, and bioengineers see the computer as having abilities beyond processing logic. Industry leaders push the computer as the most important innovation ever, specifically promoting the idea that computers are less fallible than humans and will ultimately be able to do anything humans can, faster and more accurately.
While this technological optimism may have faded some since the '70s, there is something else Weizenbaum describes that is absolutely timeless: a fatalistic attitude towards progress. Scientists tell us that within so many years technology will be able to do such and such: fly an airplane, understand spoken English, integrate with animal brains and optic nerves to create new, hybrid life forms. We feel that the only real test of progress is its ability to amaze: "the validation of scientific knowledge has been reduced to the display of technological wonders" [265].
Weizenbaum loathes this state of affairs. Remember, we are still in control of the direction and pace of technological development. Scientists are pressing forward without regard for "higher" principles or the possibility that there exist things that we can, but perhaps should not, do. They all seem to have the innocence, the obliviousness, of the monomaniacal, compulsive programmers he has described earlier.
This reactionary stance--the opposition to the amorality of science--is certainly not unique to Weizenbaum. But in combination with a loving look at computers, it makes this book unique.
The problem arises when computers are put in charge of life-and-death decisions. Weizenbaum cites many cases in which people have come to a point of crisis and a major change needed to be made, some involving large segments of the population. The advent and hype of computer technology has convinced the decision-makers that what they need is more computer power, replacing human tasks with machines that do the same work. Our fundamental thinking here is wrong, he states. Instead of sitting down and trying to find a better way of doing things, using our human intuition and initiative, we now have the option of throwing technology blindly at the problem.
For example: instead of dealing with the problems behind America’s car obsession by, say, promoting mass transportation, we let computers make the mass production of vehicles the easiest solution.
Instead of finding alternatives to going to war in Vietnam, we used computers to help automate the location of strategic targets and to convey information (and misinformation) from the front.
Weizenbaum reaches the surprising conclusion that the invention of the computer has actually had a conservative effect on our nation’s systems: it has "immunized" us "against enormous pressures for change" [31].
Here's where Weizenbaum changes modes, and there's a problem with his moralizing assertions. He lays out what he believes technology should never do: it should never substitute computer power for a purely human function, and it should never take on a task unless that task meets a human need not readily met any other way and its side effects are entirely foreseeable. The problem is that he provides no logic to support his thesis. Instead, he seems to think it should be obvious that to substitute mechanisms for human functions is "immoral." He does mention historical precedent--new technologies introduced for benign purposes seem to end up promoting warfare [269].
No longer is he the didactic professor of computers and language theories: he makes a passionate plea to his readers, computer scientists and teachers, to think for themselves. It is a reminder of free will. "People have chosen" to make things exactly the way they are today, and our choices will affect the future [273]. We are already thinking too much like our machines if we believe that the progress of society is a behavior as predetermined as the progress of a computer’s algorithm.
Weizenbaum does not provide extensive logical proofs for his statements; nor would that be effective, considering that a fundamental part of his appeal, underlying the entire flow of the book, is that we have gone wrong by placing our faith solely in quantitative studies, numbers, and logic. Hard science is not the only source of wisdom: he mentions J. S. Bach and Arthur Miller. It is a bit surprising that a computer scientist would endorse musicians and playwrights as sources of truth as valid as mathematical truth. But something about this humanistic message rings true: we are human beings, not calculators--and it's worked out pretty well for us so far. Our free will, creativity, intuition, and initiative are exclusively human, can never be automated, and should be trusted and preserved.
His audience and its attitude have probably shifted, as I said earlier--Weizenbaum expects his readers to be optimistic about computers’ increasing role in society and about scientists’ ability to make computers smarter, faster, and more human-like. Well, maybe we still are, despite the mundaneness of computers and their accelerating intelligence. We have had more time to get bored. If anything, we are impatient for them to get smarter. But smarter in making smaller decisions: rerouting us around an accident on the interstate, recommending a movie or a taco stand... these micro-decisions, multiplied, are directing the tides of human traffic, but maybe not in the way Weizenbaum observed.
Weizenbaum describes centralized computing facilities where the programmers would work, and sometimes sleep. Perhaps this separation of computer scientists from the laity gave their work a sort of aura, but by now most of us do not even have to stand up to confront a computer. This proximity and familiarity have helped us understand firsthand a truth that Weizenbaum describes in a more abstract way: computers are frustratingly impersonal, dependent, and about as understanding of human needs as a blender. We no longer hold rosily optimistic views about our computers making important decisions for us; we understand that they are feeble and prone to crash, and that they require our patience, not our admiration.