In 1936, when he was just twenty-four years old, Alan Turing wrote a remarkable paper in which he outlined the theory of computation, laying out the ideas that underlie all modern computers. This groundbreaking and powerful theory now forms the basis of computer science. In Turing's Vision, Chris Bernhardt explains the theory, Turing's most important contribution, for the general reader. Bernhardt argues that the strength of Turing's theory is its simplicity, and that, explained in a straightforward manner, it is eminently understandable by the nonspecialist. As Marvin Minsky writes, "The sheer simplicity of the theory's foundation and extraordinary short path from this foundation to its logical and surprising conclusions give the theory a mathematical beauty that alone guarantees it a permanent place in computer theory." Bernhardt begins with the foundation and systematically builds to the surprising conclusions. He also views Turing's theory in the context of mathematical history, other views of computation (including those of Alonzo Church), Turing's later work, and the birth of the modern computer.
In the paper, "On Computable Numbers, with an Application to the Entscheidungsproblem," Turing thinks carefully about how humans perform computation, breaking it down into a sequence of steps, and then constructs theoretical machines capable of performing each step. Turing wanted to show that there were problems that were beyond any computer's ability to solve; in particular, he wanted to find a decision problem that he could prove was undecidable. To explain Turing's ideas, Bernhardt examines three well-known decision problems to explore the concept of undecidability; investigates theoretical computing machines, including Turing machines; explains universal machines; and proves that certain problems are undecidable, including Turing's problem concerning computable numbers.
I love Gödel, Escher, Bach, but GEB is over 800 discursive pages. Turing's Vision is a short translation of Turing's key paper "On Computable Numbers, with an Application to the Entscheidungsproblem". Turing's paper conclusively proved a key result in mathematics: that some questions cannot be answered "yes" or "no", but will drift in infinite indeterminability. And second, his model of a simple machine with states and an infinite memory provided a conceptual design for the first universal computers, as opposed to electro-mechanical calculating machines.
While no one programs pure Turing machines, instead working with friendly abstractions, the Turing machine is the strongest model of computation that we know how to build (stronger ones involve breaking the laws of physics). Bernhardt offers simple and elegantly explained proofs by contradiction to show the powers and limits of this class of machines. While the arguments are fuzzier than pure math demands, this is also a book that is almost thrilling in its readability, and it's a math book!
This book breaks down Turing's approach to computation, and in doing so gives a glimpse into what "universal computers" really are. I believe this book should be required reading in any high-school math class, for anyone who is wondering if computer science is right for them, or for anyone wondering why some problems are easy for computers while others are hard. This is written for a regular person, not a computer scientist. There's a bit of logical reasoning (and even some counting!) in the book, but nothing that should overwhelm someone who wants to learn what computation is all about.
The author sets out to explain Turing's historic paper: "On Computable Numbers, with an Application to the Entscheidungsproblem". I know, I know, I probably don't need to say any more than that - you're already ordering it off of Amazon. Or, more likely, you're like most of the rest of the world and couldn't care less about some stuffy old paper with a German word in the title. This paper is one well worth being familiar with: it sparked the computing revolution. No, the paper didn't describe how to build a computer or the internet; instead, it showed a way to capture how problems can be approached and solved systematically, and that not all problems have answers that can be computed. That last statement might seem obvious to you, but it wasn't to mathematicians around Turing's time. Many were of the opinion that any problem that could be expressed could be solved, and that it was just a matter of time before a proof was developed to show that this was the case universally. Turing went after the "decision problem," or Entscheidungsproblem: the challenge of showing that, for any mathematical statement, it can always be decided whether the statement is true or false. Turing showed that there is a third option: some problems are undecidable; a true-or-false decision cannot be reached no matter how long you (or a similarly capable computer) try to compute the answer.
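To make that third option concrete, here is a minimal sketch of the standard contradiction behind the halting problem (not from the book or from Turing's paper; the names are purely illustrative):

```python
# Suppose, hypothetically, a perfect decider existed: halts(program, argument)
# would return True exactly when program(argument) eventually halts.
def halts(program, argument) -> bool:
    raise NotImplementedError  # Turing's argument shows no such function can exist

# Build a program that does the opposite of whatever the decider predicts
# about a program run on its own source.
def troublemaker(program):
    if halts(program, program):
        while True:      # predicted to halt, so loop forever instead
            pass
    else:
        return           # predicted to loop forever, so halt immediately

# Does troublemaker(troublemaker) halt? Either answer the decider could give
# is contradicted by troublemaker's behaviour, so no such decider can exist.
```

This is the flavour of proof by contradiction that several of the reviews below refer to.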
This is not the most straightforward concept to understand, and yet the author lays out a series of brief chapters that works a reader into position to discover this fact for themselves. The author does better than to try to go through Turing's paper section by section; instead, he goes through the quintessential ideas and arguments that Turing used to prove his point. He wisely includes more recent explanations and examples where appropriate, and these anachronisms greatly improve readability. The author also elected to tie the mathematical results in with the birth of the modern computing age, but sticks just to the first few years of actualized computers, and also gives a thoughtful account of Turing's final years.
There are other interesting books about Turing's life and legacy, a bunch related to his time breaking the German Enigma code in WWII, but this is the best one I've seen to date that really tries to tackle his core mathematical contribution. To be completely fair, this is not a new subject for me. I first learned about Turing machines in my undergraduate days, and I've spent a good amount of time reading about computability and related topics. That said, I wish I had read this book before my automata theory textbook because it does such a great job with the subject. It's no replacement for a proper textbook if you're aiming to be a computer scientist - I believe getting a real feel for computability and undecidable problems requires spending time building examples of deterministic finite automata, context-free grammars, and the like - but this book really gets the reader into the right frame of mind to do that sort of thinking.
Two minor improvements for the book:
1. The term "algorithm" is used in the introduction, but it's not clearly defined. I think adding the word to a sentence a few lines back from where it is first seen should clear up any confusion: "He realizes that any computation can be broken down to a sequence of simple steps, *called an algorithm*." It might not be the most rigorous definition of the term, but functionally I think it works.
2. Typo on page 84: there's an equation that translates a binary number into a decimal one, but an exponent of 7 is used where a 6 should be. Printed: 1 x 2^7 + 0 x 2^7 + 0 x 2^5 ... Correct: 1 x 2^7 + 0 x 2^6 + 0 x 2^5 ... It doesn't change the result, but if a reader is unfamiliar with different number systems it might be a little confusing.
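For readers unfamiliar with the place-value expansion behind that correction, here is a minimal sketch (not from the book; the example bit string is hypothetical):

```python
# Convert a binary string to decimal by summing bit * 2^position,
# where the leftmost bit of an 8-bit string contributes 2^7.
def binary_to_decimal(bits: str) -> int:
    total = 0
    for position, bit in enumerate(reversed(bits)):
        total += int(bit) * 2 ** position
    return total

print(binary_to_decimal("10000000"))  # 1*2^7 + 0*2^6 + ... + 0*2^0 = 128
```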
Damn this book was good. I wish I had read it while I took Introduction to Complexity Theory in college. Bernhardt explains some pretty challenging concepts in a very digestible way. On top of that, Bernhardt gives a great history lesson and helps you appreciate why these seemingly esoteric ideas are so important and so impactful.
Oof. My 2-star rating should not necessarily be regarded as me saying that this book is “bad”… just not what I was expecting. I had hoped for a broader, more general-interest and abstract overview of the concepts, rather than such nitty-gritty details. I’m grateful that the world is populated with those who find such dry and demanding thought enjoyable, because it has changed life as we know it dramatically, but for me finishing this book was a matter of stubbornness, an exercise in self-discipline, and a little masochism thrown in.
An excellent short and mostly informal introduction to the theory of computation and related topics, with some historical background material. Highly recommended to students, but an interesting read for specialists as well. The text was not edited carefully, though, as it contains a number of typos, doubled words, or superfluous words.
“We shall do a much better programming job, provided we approach the task with a full appreciation of its tremendous difficulty, provided that we respect the intrinsic limitations of the human mind and approach the task as very humble programmers.” - Edsger W. Dijkstra, from his 1972 Turing Award lecture
This is not a book for the interested layman. I have master's degrees in both computer science and in mathematics. It took me two tries to get through this. The first half of the book is about the mathematical development and underpinnings of the theoretical Turing Machine. I read this with a pad of paper and pencil nearby to make notes about the construction of Post's correspondence problem, finite automata, the equivalence between finite automata and the description of their algorithms using strings of 1's and 0's, the expansion to Turing Machines, etc. It was fascinating but required close concentration. Unfortunately there were a couple of cases (unavoidable, I know) where the author simply said things like "Post proved that the correspondence problem is undecidable" without offering a proof. The author then went on to prove other things based on this statement. I would have liked to have seen the proof, but this could lead to the desire to see more proofs going all the way back to the axiomatic development of all of math. So I just pressed on.
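For anyone who has never met a finite automaton, here is a minimal sketch of the kind of machine meant above (not one of Bernhardt's examples; the state names and transition table are illustrative):

```python
# A deterministic finite automaton that accepts binary strings
# containing an even number of 1's.
TRANSITIONS = {
    ("even", "0"): "even", ("even", "1"): "odd",
    ("odd", "0"): "odd",   ("odd", "1"): "even",
}

def accepts(bits: str) -> bool:
    state = "even"                        # start state
    for symbol in bits:
        state = TRANSITIONS[(state, symbol)]
    return state == "even"                # "even" is the accepting state

print(accepts("1001"))  # True: two 1's
print(accepts("1011"))  # False: three 1's
```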
I didn't find the second half so interesting, primarily because it was material that was familiar to me (Hilbert's Hotel, proofs by contradiction, diagonal arguments, and the cardinality of natural and real numbers). It led up to a section about the number of computable numbers. I didn't really feel that it was particularly pertinent to the first half of the book. The very end of the book was a brief history of Turing's work and the apology and pardon given to him posthumously for his treatment by the British "justice" system for being gay.
If you like logic, have a strong background in math, or are an assembly programmer, this might be the book for you. If you are just a bit curious about Turing, find something a lot less technical.
This is a very readable introduction to computability, a difficult yet rewarding area of computer science. While not going very in-depth, it's a wonderful stepping stone, and well worth reading before tackling some of the more involved books.
I enjoyed every sentence of this book. The book acts as a prism, splitting computation into its most fundamental constituents. The author managed to distill extremely complex ideas without sacrificing precision, which is no easy feat. The concepts are explored thoroughly with clean, clear writing.
You'll quickly notice when things become challenging as confusion sets in. This signals that the section requires deeper contemplation from the reader, perhaps even necessitating going off on a tangent to study related topics before continuing.
I consider a book successful when it opens new pathways for the mind to explore; like a museum, each section reveals something new, creating a yearning to explore more.
I failed this book. Despite my sincere interest in math, logic, and Turing, I was not able to get the most out of Bernhardt's deftly written explanation of Turing's all-important thesis. I am not ashamed to say that a smarter person or a more dedicated reader than myself, particularly in the field of math or logic or computer science, will probably fly through this. I have great respect for Bernhardt's commitment to translating this dense subject for a person who is unfamiliar with the field. However, this book deserved more of my attention than I gave it, as I only occasionally grabbed pencil and paper to bravely attempt to follow the method to the Turing machine madness, and more often ignored it entirely for months.
But even for a dumb dummy like myself, I did thoroughly enjoy the last section on Turing's Legacy, which speaks loudly even for those who are unable to grasp the nuances of his logic and intellect. Turing's innovation, influence, and ripple effect are undeniable.
This is an interesting introduction to some of the key mathematical work of Alan Turing (mostly his 1936 results on Computable Numbers and Turing Machines) relating it to both contemporary developments in logic (the lambda calculus) and later developments in computer science and related ideas like finite automata. The book is thus mostly an introduction to mathematical and computer science concepts but serves to give them some intellectual context also and even a little bit of social context for Turing's life and legacy.
The book begins with some discussion of mathematical logic, Hilbert's program (the Entscheidungsproblem), and the Gödel results that Turing's work built on, responded to, and contributed to. Things like finite automata and the lambda calculus are worked through in preparation for leading the reader through Turing's laying out of his model of computing in terms of Turing machines and his proof that the computable numbers are not effectively enumerable. Other mathematical work, such as Cantor's diagonalization argument, is sketched out and worked through to the extent necessary to relate it to Turing's work. Some key elements of Turing's life and other work are summarized, as are a few seminal instances from the origin of the computer, some philosophical issues around artificial intelligence, and a final reflection on Turing's life and legacy.
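As a minimal sketch of the diagonalization idea mentioned above (not Bernhardt's presentation; the toy list is illustrative): given any list of infinite binary sequences, flip the n-th bit of the n-th sequence to build a sequence that cannot appear anywhere in the list.

```python
# listed[n] is a function giving bit k of the n-th sequence; the diagonal
# sequence differs from the n-th listed sequence at position n.
def diagonal_sequence(listed):
    return lambda n: 1 - listed[n](n)

# Toy illustration with three "sequences": all zeros, alternating, all ones.
listed = [lambda k: 0, lambda k: k % 2, lambda k: 1]
d = diagonal_sequence(listed)
print([d(n) for n in range(3)])  # [1, 0, 0]: differs from sequence n at index n
```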
The book is a fairly readable and understandable exposition of the mathematics. However, at times I found the examples and the working-through of the problems a bit brief. I had some rather haphazard knowledge of the mathematics in Turing's work, including from my own attempt to read Turing's paper, but much of the mathematical logic and machine theory was new to me. There is a difficult balance between being comprehensible to the general reader (or at least the interested neophyte) and avoiding tedium in explaining the very simple but elaborate procedures involved. Also, the depth and breadth of the discussion is limited. Careful reading would no doubt be rewarded with greater comprehension, but I found I profitably read it without having to master all the technical nuance presented.
The history presented is often very schematic: outside of tracing the pedigree of certain mathematical techniques, there are just a few bits of greatest hits. The reader may come away with a somewhat inflated sense of Turing's role in the origin of actual computing machines (Turing's exact contributions to various elements are a matter of some dispute), and a few interesting early machines are left unmentioned in the cursory round-up of early computers, but otherwise I did not notice any serious errors.
Maybe I need to stop snapping up the discount books on "BookBub" that sound interesting.
This will be a very narrow-interest book that most readers should avoid. It prepares the reader (and effectively so) to understand what Turing's most important mathematical work really means, both for math and for computer science. If a discussion of mathematical proofs (nothing heavy), finite state automata, set theory, operations on infinity, and so forth sounds horrifying, this is *not* the book for you. I considered myself well grounded in both computer science and its mathematical foundations, and still this was not a simple read.
My education did not include automata theory (typically taught as a precursor to compilers, which I also didn't study), so this was a good introduction. My experience with proofs (outside of what is written on a bourbon bottle!) was not positive, so I had to work to follow what I knew were relatively basic arguments. I have enjoyed discussions around operations on infinite sets (Hilbert's famous "infinite" hotel gets covered well in this book), so those were enjoyable in this book as well. Consider the challenge of a hotel with an infinite number of rooms, all of them occupied, which suddenly faces the arrival of additional guests - how do we get them to an empty room? Clearly, an infinite hotel always has room for infinitely more guests, right?
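For anyone who wants the answer spelled out, the standard resolution is a simple room-shuffling rule; a minimal sketch (not the book's wording):

```python
# Every room is occupied, yet everyone can still be accommodated.
def room_for_one_new_guest(current_room: int) -> int:
    return current_room + 1      # everyone shifts up one, freeing room 1

def room_for_infinitely_many_new_guests(current_room: int) -> int:
    return 2 * current_room      # everyone moves to an even room, freeing all odd rooms

print(room_for_one_new_guest(5))               # the guest in room 5 moves to room 6
print(room_for_infinitely_many_new_guests(5))  # ...or to room 10, leaving the odd rooms free
```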
This is not a history of Turing, though his work in the field is covered briefly near the end of the book. It is a reminder of just how much he contributed before his relatively young death.
In the end, if you want a solid grounding in automata, Turing machines and what it truly means when somebody says "the halting problem", this book will be a very good read. For everybody else, you likely will be unhappy with much of the book.
Bernhardt provides a brilliant overview of Alan Turing's most groundbreaking contributions. The only prerequisites for this read are reasoning skills and familiarity with mathematical notation (if you know '>' is the greater-than sign and that {a,b,c} is a set, you're golden). The book gradually builds on itself, starting with logic and ending with the core ideas of Turing's revered paper.
A clear explanation of the overall goals of early 20th century mathematicians is provided as well. So, by the last page, not only do you come out with a solid understanding of Turing's ideas but also the context and motivations of his work.
If you've read and enjoyed "Godel's Proof" by Nagel and Newman, you'll enjoy this. Conversely, if you've already read this and are looking for something similar, look no further than "Godel's Proof"!
This is really a cool book about theoretical computer science! Recommended to anyone who wants to learn about the theory of computation without going through an in-depth course.
This is a slightly technical book, so you need to have some basic maths if you want to make the most out of it. The book goes into a number of proofs, some of them by contradiction. Basic logic and set theory might be enough to understand this book.
Some of the interesting topics treated in this book are:
- What are the connections between logic and computation?
- What's the mathematical definition of computing?
- Are there questions for which we cannot build an algorithm? (Spoiler: there are, and it's been proven mathematically.)
- Are Turing machines just a theoretical model, or do they have a connection with modern computers?
Excellent primer for curious people without a technical background who want to know who Alan Turing was and what his contribution was to the modern technical landscape. We owe a great debt to his genius for inventing the theoretical computers he called "universal machines," which started the genesis of all modern computing devices: almost every gadget in sight, from our computers and phones to other digital systems.
Most people know Alan Turing as the code-breaker who shortened WWII and saved countless lives by cracking open the Enigma encryption used by Nazi Germany. However, few know that it was Alan Turing who pioneered the field of computer science. This book gives an account of how and why!
Bernhardt does an admirable job of demonstrating and explaining the huge contributions Alan Turing made to both mathematics (especially set and number theory) and the foundation of what we now call computer science. The math and logic are not particularly “difficult” to understand, but I hadn’t encountered much of the notation (sets especially) and thinking (mathematical proofs) since high school math class. The implications of Turing’s logical mathematical arguments are stunning, especially as they relate to computer programming and calculation. Beware, this is a bit of a slow read due to lots of math!
Wow, I wish I'd had this book when I took my Theory of Computation course. My prior experience made the content easier to follow, but there is a considerable amount of auxiliary information which is sometimes difficult to distinguish from the essential information. The author seems to propose as his thesis that he will be explaining Turing's groundbreaking paper, providing all the necessary background information (and more). With the side topics and the cumulative nature of the background information, I feel like the author should have restated and summarized the content. Without such a summary, I was left at the end of chapter 8 thinking "oh, we're done?"
Bernhardt did a great job of providing an overview of computational theory, specifically the vast influence that Turing had on the field. I particularly enjoyed the way certain proofs were explained using easy-to-understand examples before the official proof was introduced. I also enjoyed the asides throughout the whole book.
That being said, I do have a CS background, and I do not think I would have been able to follow it as nicely without one. But given the difficulty of the topic, Bernhardt does a fantastic job!
This book will give immense pleasure to anyone interested in the field of theoretical computer science. Starting with a brief introduction to the work in mathematical logic by Russell, Hilbert, and Gödel, then moving on to topics such as finite automata, Turing machines, undecidable problems, and finally computable numbers, this book provides a remarkable introduction to the methods employed by Turing to answer the question of the Entscheidungsproblem.
I loved this book :) If you want to understand the theory surrounding the origin of computers, this book is ideal.
Since it covers some complicated topics, more advanced math skills are necessary. However, the author explains everything in an accessible way, so with a little patience it is possible to understand even the most difficult concepts.
The author has an amazing ability to explain things for people like me who have only a little background in mathematical logic. Apart from the chapter on lambda calculus, in which the explanations aren't long enough (I needed to learn from other books and spent a lot of time), the book is very understandable and deserves more than five stars.
Terrible book. The only good part about it is the last chapter, which is about the personal life of Turing; other than that, the technical parts are horrible, so bad at explanation and missing a lot of context. If it's a biography, it's one of the worst I have ever read. And if it's a technical book, it would be the worst too.
I knew a lot of the Bletchley history and had read the imitation game paper, but never read the computable numbers paper. This book covers that - Turing machines, encodings, and the halting problem - in a very accessible way for a non-computer scientist.
This seems to be a very underrated book. Besides Gödel, Escher, Bach, this book gave me more insight into what computing really is than any other I've come across, and I'd strongly suggest it to anyone entering the field.
This book talks about Turing's contribution to the fundamentals of theoretical computing. It is a bit technical but fun. The idea of the Turing machine really illuminated the development of modern computers.
I have to admit that I couldn't grasp some of the details of the concepts put forward by Turing, but even though I have read a few books about him, I learned a bit more about his genius capabilities and even the interesting possibility that he didn't die by suicide.
Interesting stuff. I took a course on state machines before reading it, so a big part of the content I already knew. But it's a really good book for CS students or anyone interested in learning more in depth about computers.
Much of the book is unreadable for someone without an advanced degree. The historical anecdotes are sporadically interspersed and helpful, but the magnitude of Turing's impact is not truly felt.