Mechanizing Proof: Computing, Risk, and Trust

Mechanizing Proof examines how the computerization of mathematical proof has changed what it means to trust a computer system. MacKenzie traces the history of automated theorem proving and the formal verification of hardware and software from the 1950s to the turn of the century, including controversies such as the computer-assisted proof of the four-color theorem.

440 pages, Paperback

First published March 1, 2004

About the author

Donald Angus MacKenzie

7 books · 17 followers
Donald Angus MacKenzie FBA FRSE FAcSS (b. 1950) is a Professor of Sociology at the University of Edinburgh, Scotland.

His work constitutes a crucial contribution to the field of science and technology studies. He has also developed research in the social studies of finance, and has undertaken widely cited work on the history of statistics, eugenics, nuclear weapons, computing, and finance, among other things.

Community Reviews

5 stars: 13 (50%)
4 stars: 10 (38%)
3 stars: 3 (11%)
2 stars: 0 (0%)
1 star: 0 (0%)

youzicha · 26 reviews · 5 followers
January 5, 2021
MacKenzie's job title says he is a sociologist, and in the preface he namedrops fashionable continentals and gives a quite convincing spiel about why sociologists should be interested in mechanized logic. Then the rest of the book never mentions any sociology again; it is 100% history, and quite nerdy history at that. MacKenzie understands the things he writes about, and even includes proof sketches for some key lemmas of the four-color theorem. I also like that he is always very reasonable. The preface starts with a mild-mannered plea not to go all Science Wars on him, and in the body of the text, which touches on a number of vicious debates and disagreements, he always seems fair to both sides.

The book covers a quite long time period, roughly 1950-2000. I began studying computer science soon after the point where MacKenzie leaves off, so I personally thought it was interesting to see that early history eventually link up with what I think of as modern times. Quite a few of the topics he describes were already familiar to me from textbooks, but it's a different perspective to read a narrative rather than the "timeless" theorem-lemma-theorem view of a math textbook or a perfunctory string of citations in a related-work section. The book doesn't have any overarching theme or point; each chapter is a stand-alone description of some particular episode or topic in the history of mechanized proof, so it's not possible to summarize, but I do feel generally better informed about the history of the field after reading it.

I was particularly interested to learn that the military (mostly the NSA) funded formal verification research in the 1970s and 80s in the hope of creating hacker-proof systems (cf. the Orange Book). A lot of what are now basic definitions and algorithms originated with military funding, and the vision was strikingly similar to current work. Their plans from 1982 sound exactly like what people in the field describe today, but it wasn't until the 2010s that systems like seL4 started to realize that vision.

And yet, maybe we should breathe a sigh of relief that the DoD failed, because their approach seems like a twisted nightmare version of what formal verification is today. Everything was managed in a top-down, hierarchical manner, with timelines set years in advance. Currently all the major theorem provers (Coq, Isabelle, Z3, ...) are open source and developed by international teams, but back then the source code was classified and export-controlled as a kind of munition. The people working on verifying the operating systems etc. could only read the source code on a need-to-know basis, and had to work in classified secure labs, which feels perverse if you are used to thinking of mathematical proof as a form of communication and a meeting of the minds. And while the field today is dominated by a bunch of very eccentric characters, mainly employed as university faculty, back then the proofs and tools were developed by Raytheon-style military contractors. If the DoD really had managed to produce their secure OS kernel &c., maybe we would now think of formal verification as soul-destroying bureaucratic busywork where you labor 9-to-5 to tick all the boxes on a government RFQ.

93 reviews
July 28, 2025
I appreciate MacKenzie's scholarship, but a lot of this book felt like a meandering slew of interview summaries and two-page biographies that were not strongly tied to an overarching narrative. Exceptions include the four-color theorem chapter and the final chapter.

Also, I wish this book had been written much closer to the present day. The nature of proof assistants has changed drastically since 2004; some "serious mathematicians" like Terry Tao now use assistants such as Lean to check their work, and I would enjoy learning about that development.
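
For the curious, here is a minimal sketch (my own, not from the book) of what that kind of machine-checked mathematics looks like in Lean 4: the prover's kernel accepts the theorem below only if every rewriting step type-checks.

    -- Illustrative toy example, not from the book: commutativity of
    -- addition on the naturals, proved by induction and checked by
    -- Lean 4's kernel. Nat.add_zero, Nat.zero_add, Nat.add_succ, and
    -- Nat.succ_add are lemmas from Lean's core library.
    theorem my_add_comm (m n : Nat) : m + n = n + m := by
      induction n with
      | zero => rw [Nat.add_zero, Nat.zero_add]
      | succ k ih => rw [Nat.add_succ, ih, Nat.succ_add]

Tao's projects involve far larger developments built on top of Mathlib, but the workflow is the same: the statement and every proof step are checked mechanically, so trust rests on a small kernel rather than on the reader.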

Finally, I think it's worth noting that this book focuses more on history and sociology than it does on technical or philosophical matters. This fact is made clear in the introduction, but I didn't pick that up from the cover or the back summary.
