Part of Sterling’s extremely popular Milestones series, this illustrated exploration of computer science ranges from the ancient abacus to superintelligence and social media. With 250 illustrated landmark inventions, publications, and events, encompassing everything from ancient record-keeping devices to the latest computing technologies, this highly topical volume takes a chronological journey through the history and future of computer science. Two expert authors, with decades of experience working in computer research and innovation, explore topics including the Sumerian abacus, the first spam message, Morse code, cryptography, early computers, Isaac Asimov’s laws of robotics, UNIX and early programming languages, movies, video games, mainframes, minis and micros, hacking, virtual reality, and more.
Computers have come a long way since the first working machines were built in the twentieth century. There are far more than 250 milestones in computing history, but those 250 are the focus of this book, which covers both the theory and the practice of making a working computer system. Take my story, for instance. When I was a child, I had a Commodore 64. I didn’t really know how to work it and didn’t have many programs or things to do on it, so I didn’t really get into computers until the 1990s, with dial-up modems and their now-nostalgic sounds. I bought a lot of desktops, since they are easy to fiddle with and don’t really overheat unless you overclock the CPU or skimp on cooling. The other reason was that a desktop was cheaper when I was in college. Now I have both a desktop and a laptop, since I still keep a number of useful things on my desktop.
Computers are so ubiquitous that I don’t think I need to say this, but a computer is made up of both hardware and software. The hardware comprises the physical components: the CPU, the ALU, the bus systems, the motherboard, and more. The software is the program: the set of instructions that tells the machine how to produce a game for you to play or how to respond to input.
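To make that "set of instructions" idea concrete, here is a toy sketch of my own (not from the book): a handful of Python lines that tell the hardware how to run a trivial guessing game and respond to whatever you type.

```python
import random

# Software is just instructions: this short program tells the
# machine how to play a guessing game and react to user input.
secret = random.randint(1, 10)
guess = None
while guess != secret:
    guess = int(input("Guess a number from 1 to 10: "))
    if guess < secret:
        print("Too low!")
    elif guess > secret:
        print("Too high!")
print("You got it!")
```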
The book starts at the abacus, the basic counting device that some Asian nations still use to this day. Not surprisingly, there weren’t many advancements in computing until the advent of electricity and Boolean algebra, which together made simple logic gates and electrically powered components possible. This book is laid out like the others in the series: one page contains all of the text, and the facing page has a pertinent image. At the bottom of the text page, there are references to related advancements. Since much of computing history centered on cryptography and codes, a substantial portion of the book covers the development of those techniques, both to make codes and to break them.
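To give a flavor of what Boolean algebra buys you (my example, not the book’s): once you have AND, OR, and NOT gates, you can compose a circuit that adds. Here is that composition sketched in Python rather than in wires, building a one-bit half-adder.

```python
# The primitive gates of Boolean algebra, acting on single bits (0 or 1).
def AND(a, b): return a & b
def OR(a, b):  return a | b
def NOT(a):    return 1 - a

# XOR composed purely from the primitives above.
def XOR(a, b): return OR(AND(a, NOT(b)), AND(NOT(a), b))

def half_adder(a, b):
    """Add two one-bit numbers, returning (sum, carry)."""
    return XOR(a, b), AND(a, b)

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> sum {s}, carry {c}")
```

Real hardware does the same thing with relays, tubes, or transistors; the algebra is identical.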
The entries aren’t organized by subject; they are all chronological. Some of the advancements came earlier than I thought. Electronic speech recognition was developed in the 1950s; I did not realize that. There are also a lot of "firsts", depending on what you are looking for. Take the first computer: there are quite a few candidates, depending on the category or type. ENIAC, for instance, is called the first completely electronic computer. Or is it? Colossus arguably got there first, but it wasn’t widely known. Then there were patents that actually stymied growth, and the same can be said of British secrecy after World War II.
The most obvious advancements for a computer are its components and what makes up those components. Initially it was a series of gears, with the Difference Engine and Analytical Engine of Charles Babbage. Then scientists and engineers developed mechanical relays that operated with electricity. After that came vacuum tubes, allowing for more speed. Once Bell Labs came out with the transistor in 1947, it was realized that transistors could be used to miniaturize circuits and devices, and from there we finally arrive at the integrated circuit. The computer is not merely its components, though; it is also the programs and instructions that tell it what to do. So the book includes the Jacquard loom, whose punched-hole instructions allowed powered looms to make brocade and other complicated fabrics. And no book on computers would be complete without a reference to Augusta Ada King-Noel, Countess of Lovelace, the first programmer.
Finally, the book covers advancements in the field of artificial intelligence. From pop culture to real life, it includes entries on Asimov’s Three Laws of Robotics, the coining of the term "robot", an AI proving a mathematical theorem, Robby the Robot, and more. The last entry is on the limits of computation, which we may hit fairly soon unless quantum computing pans out.
In some ways the book was informative; in others, I already knew much of the subject. Either way, that did not detract from my enjoyment of this book.
The Computer Book is a marvelous compilation of the history of computing, but it has a few serious omissions. Fortran, COBOL, and C are prominently mentioned programming languages, but I could not find any mention of Lisp! Lisp was long the dominant language of artificial intelligence research. A serious oversight indeed.
The Xerox Palo Alto Research Center was a powerhouse of innovation in computer science and is mentioned in the book. However, Alan Kay, a driving force behind graphical user interfaces, object-oriented programming, and the concept of the laptop computer, was not included!
Some of the photos were irrelevant or trivial to their topic, and some biographical details were unimportant. Some major events and discoveries were not included, such as the Turing machine. That said, the book does cover most of the important computing discoveries and inventions in history. It's amazing how, even though hardly anyone uses the abacus anymore, the principles of an object invented thousands of years ago are still impactful and will continue to be far into the future. It's incredible to see how most milestones happened after the Scientific Revolution, and how the pace accelerated over time. It was also interesting to learn that many modern devices are sleeker, more portable, and more powerful versions of devices invented in the mid-to-late twentieth century; the first tablet, for instance, dates to the 1960s.
There is a lot of information in The Computer Book. The hardcover edition is weighty in both senses: physically hefty and dense with information. There are so many interesting things in this book.
One thing I found interesting is that the first LED was invented in 1927. I never would have guessed that LEDs go back that far. Wow.
I also found it interesting that the Air Force built a supercomputer out of PlayStation 3 consoles. It's pretty cool that you could build a supercomputer with game consoles.
There is a lot more information in The Computer Book than the two examples I mentioned. If you like computers, I think you will enjoy reading it and looking at the pictures.
I wish an Audible version were offered! This is a very good book describing the history, terms, authors, etc. Enjoy yourself! It's a good reference book for computer science classes.
A nice list of this and that, vaguely chronological, but each entry offers only a bare-bones, superficial glimpse of its topic. It's more of a personal family photo album of things already understood, to reminisce about.
This is the epitome of the coffee-table book, the Platonic ideal of the genre. Each topic gets exactly one page of text on the verso and a full-page color photo on the recto.
A brief visual compendium of the hardware and software developments that have facilitated, and continue to facilitate, the transcendence of human capacity to perform computation. A nice coffee-table book for computer scientists and enthusiasts.
If you’ve ever wondered how we got from counting pebbles to training neural networks—this book lays out the entire journey, one milestone at a time. From ancient tools like the abacus, to the birth of binary, to Turing, to ENIAC, the PC revolution, the internet, and AI—it's all here, clearly explained and beautifully laid out.