
Electronic Life: How to Think About Computers

Explains the fundamental facts about computers, providing a comprehensive guide for the layman who needs to know about and wants to enjoy the home computer revolution.

266 pages, Paperback

First published August 12, 1983


About the author

Michael Crichton

222 books, 20.3k followers
John Michael Crichton was an American author, screenwriter, and filmmaker whose prolific career left an indelible mark on popular culture and speculative fiction. Raised on Long Island, he displayed a precocious talent for writing, publishing an article in The New York Times at sixteen. Initially enrolling at Harvard as an English major, he switched to biological anthropology after discovering a preference for scientific study over literature. He graduated summa cum laude and received a fellowship to lecture in anthropology at Cambridge. Later attending Harvard Medical School, he earned his MD but chose not to practice, dedicating himself to writing instead. His medical background profoundly influenced his novels, providing authentic scientific and technical underpinnings that became a hallmark of his work.

Crichton began writing under pseudonyms, producing suspenseful thrillers as John Lange, including Odds On, Scratch One, and Easy Go, and, as Jeffrey Hudson, A Case of Need, which earned him an Edgar Award. His first major success under his own name, The Andromeda Strain, established his signature blend of scientific authenticity, tension, and exploration of technological hazards, leading to its film adaptation. Over his career he wrote 25 novels, including The Terminal Man, The Great Train Robbery, Congo, Sphere, Jurassic Park, Rising Sun, Disclosure, The Lost World, Airframe, Timeline, Prey, State of Fear, and Next, several adapted into major films, with four additional works published posthumously.

Crichton also made significant contributions to film and television. He wrote and directed Westworld, pioneering the use of 2D computer-generated imagery, and later directed Coma, The First Great Train Robbery, Looker, and Runaway. He created the influential medical drama ER, which he executive produced and developed with Steven Spielberg, achieving critical and commercial success. Many of his novels, most famously Jurassic Park and its sequel The Lost World, became cultural phenomena, combining imaginative adventure with grounded scientific speculation, often exploring humanity's overreach in genetics, biotechnology, and complex systems.

His literary style was notable for integrating meticulous scientific detail, suspense, and moral cautionary themes. His works frequently addressed the failure of complex systems, whether biological, technological, or organizational, demonstrating the unpredictable consequences of human hubris. Employing techniques such as first-person narratives, false documents, fictionalized scientific reports, and expert teams assembled to tackle crises, Crichton created immersive stories appealing to both popular and scholarly audiences. His exploration of genetics, paleontology, nanotechnology, and artificial intelligence revealed both fascination and caution about humanity's technological ambitions, while his early non-fiction, such as Five Patients and Electronic Life, reflected his scientific insight and forward-thinking approach to computers and programming.

Standing 6 feet 9 inches tall, Crichton experienced social isolation in adolescence and later pursued meditation and consultations with psychics, cultivating a lifelong interest in human consciousness and alternative experiences. A workaholic, he approached writing with a disciplined, ritualistic methodology, often retreating entirely to complete a novel in six or seven weeks. He was married five times, fathered two children, and maintained a wide-ranging collection of 20th-century American art.

Crichton engaged in political and scientific discourse, particularly regarding global warming, where he was an outspoken skeptic and testified before the U.S. Senate. He contributed significantly to the discussion of intellectual property, technology, and environmental policy, coining concepts such as the Gell-Mann amnesia effect. Throughout his life, he received numerous awards, including Edgar Awards and a Peabody Award for ER.

Ratings & Reviews


Community Reviews

5 stars: 28 (15%)
4 stars: 48 (25%)
3 stars: 67 (36%)
2 stars: 34 (18%)
1 star: 8 (4%)
Displaying 1 - 17 of 17 reviews
Drew Weatherton
200 reviews, 3 followers
March 17, 2016
I honestly only read this book as part of my goal to read every Michael Crichton book. I was really surprised by how much I enjoyed it. It's intended as a new computer user's introduction to computers in the early '80s, and it's often general enough to still be valid today, though it sometimes focuses on specific hardware like the Apple II. There is a brief introduction and then alphabetical sections on various topics. I found many of these read like brief editorials about how computers have changed and will change our lives. Many of Crichton's predictions were valid. I definitely recommend this to any Crichton fan who has an interest in computer history. I also read a companion short story by Crichton, "Mousetrap: A Tale of Computer Crime", which was (to my knowledge) only published in Life Magazine (January 1984, Vol. 7, No. 1, pp. 116-126). I enjoyed the story, and it had ties to the "Computer Crime" section of Electronic Life. Ironically, I paid a lot more for the short story ($9) than for a hard copy of Electronic Life ($4).
Jerry
Author of 11 books, 28 followers
December 15, 2020

What is truly new does not create shock—it creates nothing. If we are shocked by art, we are shocked because our expectations are not met. And that means we already have expectations based on previous experience.


When I first saw this on a shelf at Dunaway’s, I expected it to be hilariously out of date and just plain wrong even for its time. Michael Crichton writing about “How to think about computers” in 1982?

When I started browsing it, however, it turned out to be interesting and useful, even despite its age. While there is some advice about floppy disks and magnetism that no longer matters, most of the advice is much more general.

The advice ranges from the specific, such as "always keep backups", to the philosophical, such as:


Render unto the machine what is the machine’s. You go do what only a human being can do.


And he also touches on the form vs. function debate. On the one hand,


The whole idea of machines is to do the job.


But on the other,


Good design is not an aesthetic frill; it matters. A pleasing appearance means somebody cared how it looks to you; it’s a strong hint that the inner workings have been arranged with the user in mind, too.


Even when he’s wrong, he’s right. I noticed long ago that while special-purpose computer tools might be the right choice right now, it is never a good idea to bet on them in the long term. In 1982, for example, if you needed a computer only for word processing, it might have made sense to buy a dedicated word processor instead of a general-purpose computer.

But if you were going to invest in either a computer company or a typewriter company, the smart bet was that general-purpose computers would eventually drive out the special-purpose computer. The reason is that the general-purpose computer does more things. You’re going to have the general-purpose computer eventually anyway, and it will perform functions tangential to word processing that end up being essential to it, such as browsing the web, printing to a completely new style of "printer" (such as PDFs), and in general surpassing the dedicated word processor even at specifically word-processing tasks.

This has happened not just with word processors but with video editors, book readers, even, I suspect, television sets. Crichton predicts, however, that “The trend toward specialized manufacture at the level of the silicon chip is bound to increase the range and availability of special machines.”

Which was true; computers are everywhere. And yet:


Indeed, it may turn out that the general-purpose computer is a creation unique to the late twentieth century. It may soon be practical to have one computer for doing graphics, another for doing word processing, another to link with our interactive cable TV, and so on.*


What he didn’t see was that the processes that made specialization possible, while they would put computers everywhere, would also make them powerful enough to handle more and more tasks on their own.

Except that he did see that as a possibility. That asterisk at the end of the quote led to a footnote:


*There is a counter-argument, based on the history of digital watches and calculators. Those devices have shown a clear trend toward incorporating more and more functions into a single device.


So that now most of us carry a general-purpose computer to handle everything from picture-taking to calculating to telephone calls.

There’s a similar bit when he talks about CP/M and Unix, the two major machine-independent operating systems.


If your machine can run under CP/M, then you have access to [an enormous range of programs]; if not, you don’t, and never will… Unix, another operating system… is gaining popularity. The principal advantage of Unix is that it is extremely portable—programs developed on one machine are easily converted to another. Its immediate usefulness, however, is not as clear as that of CP/M.


It seems pretty clear that he hasn’t used Unix and is reporting what makes sense to him of what he’s heard about it. But that he even mentions it in 1982 is pretty impressive.

The back of the book contains a handful of BASIC programs, running through the very basics of how to use an Apple II or an IBM PC to write and save BASIC programs. The programs are the same in each case, and the steps nearly the same, merely translated between the Apple II and IBM PC disk operating systems.

He chooses the programs to show off how a computer can look creative when in fact it is only doing exactly what the programmer has told it to do. In this case, they randomly choose plot points for a stereotypical mystery story, make predictions from the I Ching, or produce random profundities in response to questions.
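
To give the flavor, here is a minimal sketch of that last sort of program in the book's language, BASIC (GW-BASIC style). The listing is my own toy illustration of the idea, not Crichton's actual appendix code:

    10 REM RANDOM PROFUNDITY GENERATOR -- ILLUSTRATIVE SKETCH ONLY
    20 RANDOMIZE TIMER
    30 DIM A$(3)
    40 A$(1) = "THE ANSWER LIES WITHIN YOU"
    50 A$(2) = "ALL THINGS MUST CHANGE"
    60 A$(3) = "WHY DO YOU ASK"
    70 REM READ A QUESTION, IGNORE IT, AND PRINT A CANNED REPLY AT RANDOM
    80 INPUT "ASK ME ANYTHING"; Q$
    90 PRINT A$(INT(RND * 3) + 1)
    100 GOTO 80

The machine looks oracular, but every word it can ever say was typed in by the programmer.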

The programs are meant to improve the reader’s ability to recognize that what computers produce is no more to be trusted than what humans produce—because it’s the same thing. Sadly, that’s the most out-of-date part of the book. Not because the lesson isn’t important, but because convincing people to experiment with programming would be much harder to do nowadays.


Computers are information-processing, communicating devices. And if they set a new standard for information processing and communication by human beings among themselves—well, we’ve needed that for a long time.
itchy
2,971 reviews, 34 followers
June 23, 2020
titular sentence:
p39: The only solution is to accept instantaneous dissemination as a fact of electronic life.

ocr:
p62: In mathematical terms they use binary notation, a series of l's and 0's.

p62: But machines don't understand WRITE or PRINT; they only understand l's and 0's.

p92: WHAT MAKES YOU THINK I AMENT A HUMAN BEING KIDDO

Something seems missing from my copy. Michael was hinting at computer code included at the end of the book, but I can't seem to find it. Or maybe I didn't recognize it for what it was, ha.
Kyriakos Sorokkou
Author of 6 books, 213 followers
September 21, 2020




You can also read the Greek review at βιβλιοαλχημείες.

I don't recommend this book if you are not a Crichton fan.
I don't recommend this book if you haven't read any Crichton.
I don't recommend this book if you are not a fan of science nonfiction.
I don't recommend this book if you are not into computers and programming.

In other words, it's only for those few people who are Michael Crichton fans and completists like me. People who are roaming the streets and not occupying a cell in an asylum.

This book, along with two more by Crichton (Dealing, or The Berkeley-To-Boston Forty-Brick Lost-Bag Blues, and Jasper Johns), is out of print and pretty rare, so it's hard to find and a bit expensive.

But I managed to track them down, with the exception of Jasper Johns.

Electronic Life was written in the early '80s, and it is, in a way, a guide to computers for the uninitiated. Computers that now belong in the Science Museum. Computers from the '80s that had less memory than a normal smartphone.

It also talks about floppy disks the size of a paperback and computers from before the graphical interface. Anything you wanted to do was done by typing commands, not by searching for a folder's icon.

But the reasoning (on how to choose a new product), the ideas, and the humour of Crichton's writing are certainly not dated.
And that's what I enjoyed most about this book.
That, and the fact that it now belongs to my collection.
r.
174 reviews, 24 followers
June 18, 2020
"Personally I hope that, for once in the twentieth century, a new technology will stay free. Because the rule-makers always manage to kill the essence while tidying up the details. Dogma replaces direct experience, and ritual becomes reality.

Right now, we are at the edge of a new era of unlimited potential. Nobody can see what is going to happen; nobody knows.

There are no experts, and there's no reason slavishly to follow anybody's instructions about anything.
Including mine.

Do it your way, and have all the fun you can."


"Good design is not an aesthetic frill; it matters. A pleasing appearance means somebody cared how it looks to you; it's a strong hint that the inner workings have been arranged with the user in mind, too."


"Actually, illiteracy was never a good idea unless you aspired to be a slave. Knowledge is power, and knowledge exists as manipulated symbols. That's been true for several thousand years. It isn't going to change anytime soon."


"There are many ways of looking at reality. Science is only one way. The most profound experiences of human life lie beyond science—and beyond computers. This does not diminish either science or computers.
It's just a perspective."
Kaare Kvenild
26 reviews
July 9, 2014
I picked this up used because Crichton is my all-time favorite author. It was fun to read because it really picks up on a lot of fears people had about computers back in the day (as in, back in 1983). Some really (and unintentionally) funny stuff.
202 reviews, 9 followers
June 21, 2024
When I recently discovered that Michael Crichton had written a nonfiction book about computers back in 1983, I wasted no time finding it. Perhaps this would be much like another book I have: "The Essential Guide to Home Computers" by "Dune" author Frank Herbert from 1980. I still recall paging through and buying Herbert's book as a teen at the mall, and I reviewed it here a couple years ago. But I don't think I was really aware of Crichton until the "Jurassic Park" movie came out, or maybe when the novel made the best-seller lists.

This was an interesting time in history: famous people like Crichton were writing guides explaining to people what these newfangled computers were all about and whether we should be afraid of them. The closest modern analogy I can think of today is books about AI. For personal context, this book came out about the time I was taking my high school's inaugural Computer Science class, and teaching myself BASIC on my Commodore VIC-20. But I wasn't as much of a geek or as good a programmer as my best friend, who had an acoustic-coupler modem like Matthew Broderick had in "Wargames" and was dialing into computer bulletin board systems, BBSs. (As a hint of the Zeitgeist, "Wargames" was also released in 1983, the year this book was published.)

By this point in time the multi-talented Crichton had taken time off from his job as a Harvard-trained OR physician to write a handful of novels, including "The Andromeda Strain" and "The Great Train Robbery", AND had directed movies, including the movie adaptation of his "Great Train Robbery", starring Sean Connery. So he was well-known, just not by me in my callow youth. (In total, physician-novelist Crichton directed 8 movies and a video game!)

And he had become an early-adopter computer user, noting in the introduction that this book had grown out of notes he'd been making to help friends use their new computers. He mentions that he himself was writing software for the film industry and promoting its adoption of minicomputers. The late Michael Crichton, in case it's been forgotten by now, was famously smart.

To the book. It starts off with a very quaint intro to computers, with such suggestions as taking time to figure out which part is the computer and which part is the keyboard, and then sticking on 3x5 cards to label the parts until you have them committed to memory.

An odd thing about the book is that it's not organized by chapters. Instead it's a compilation of brief sections, running from a half page to three pages in length. It seemed to me they were just randomly ordered. Then I realized the sections are in alphabetical order by the section NAME: for example, one sequence is "Eyestrain", "Fake", "File", "Games". Which, yes, effectively means the topic sequence is random.

And so it's really just a bunch of separate musings by Crichton. The subtitle of the book is "How to Think About Computers", and sure enough it tends more philosophical than practical. The section "Courseware" criticizes the use of jargon such as "courseware". The section "Scientific Models" tells an anecdote about two surgeons: a precise, by-the-book one who got poor results, and an imprecise and seemingly inattentive one who got good results -- concluding that some things just aren't explicable by science.

Not that he ignores practicalities entirely: he urges you, for instance, not to buy a monitor that lacks an on/off light.

Finally he gives appendices for getting started using the Apple II and the IBM PC, with sample BASIC programs, which I of course programmed and ran on my retro DOS machine in 1980s GW-BASIC.

As for his vision of the future, he wasn't bad. He didn't foresee the Internet, but he did have an appreciation for the usefulness of remotely networked computers and BBSs. He didn't anticipate the iPhone, but he did know "portable computers" were becoming smaller and lighter and would come to be "book-sized". He also foresaw it would be common for people to work from home with computers -- just like I do now.

I guess the fairest way to judge this book is, would it have been useful in 1983? Would I, as a high school kid sitting at my bedroom desk with my VIC-20, have benefitted much from this? Well, no. What would have benefitted me more was the sort of computer user magazines they had back then -- with type-in BASIC programs, primers on machine language, articles about programs and games, advertisements that opened your eyes to products on the market. Airy philosophizing about the nature of future copyright laws would not have meant much to me.

So it's not the book I needed in 1983 but from this vantage point, the most interesting thing about the book is learning about Crichton and his endeavors at the time.

A decade later, come 1993, Crichton was most famous as the author/brains behind the Steven Spielberg movie "Jurassic Park" -- which of course had revolutionary CGI -- but he himself had already directed an iconic CGI-enhanced movie 20 years earlier: for his second director credit, Crichton wrote and directed the 1973 "Westworld", the original IP behind the recent TV series all these years later. And "Westworld" was, according to him, the very first movie to use CGI -- though he doesn't use that term; he calls it the first feature film to "process imagery by computer" -- because Crichton himself had a vision that THAT was what was needed for a film about rampaging robots. (And I do recall watching Westworld as a kid when it made it to TV at that time: my first introduction to Crichton, even if I didn't pay much attention to the director/writer credits as a 7- or 8-year-old.)

[As a fitting aside, Steven Spielberg's father had led the team that developed General Electric's first commercial computer in the late 1950s, a model which was then used in the 1960s at Dartmouth to invent the BASIC programming language, the language of Crichton's samples in these appendices from the 1980s -- which I've typed in and run here in the 2020s!]

So in the end, not a book for everyone but a bit of nostalgia for a Gen-X computer nerd, and maybe something unexpected for the Crichton fan and completionist.
Alex Shrugged
2,769 reviews, 30 followers
June 14, 2020
This is an excellent book for 1982. For today (2020) it's still interesting. It is written like a one-volume encyclopedia of computer subjects. He writes a short essay on each subject, most of them no more than a couple of paragraphs, but the longer essays are the ones to watch for. Those still apply today. You don't need to learn how floppy disks work any more. You do need to know generally how to run a software project when you don't know how to program.

He includes some information on how he did the Westworld special effects, and he explains (briefly) how the biowarfare maps and infection mapping were done for his book "The Andromeda Strain". He also writes an essay on Artificial Intelligence.

If I were to read this book again, I'd skip to the longer essays and read only them. Despite the dated material, it was still a good read.
27 reviews, 3 followers
March 20, 2018
I hunted down this book to complete my Crichton collection. Needless to say, I was impressed with its contents. So many references to his other works, like Westworld and Five Patients, and to what would be mentioned later in Travels. He even wrote a couple of programs (located in the appendix) whose functionality was easy to understand. I also liked the tidbit about how he got a D on a paper in Biostatistics (pg 95). Looking back on learning how to use the Windows cmd line and launching programs via large floppy disks, the way we use computers hasn't quite changed. I mean it has, but it hasn't. Crichton can actually explain the difference in the meaning of 'change' itself at the beginning of the book far better than I can here. But anyway, what I'm getting at is: read this, and if you're old enough, remember the whirr and beep of the 80s and 90s, and never forget that now you don't have to kick your sibling off the Internet so you can place a call to your sweetheart.
Pranay Kothapalli
14 reviews, 1 follower
October 25, 2018
This book was published a decade before I was born. As someone who's been in love with computers since childhood, I could relate to a lot of the facts and fiction, fiction that turned out to be fact. The fear of computers it describes predominantly exists even today. Every day we talk about computers taking over us one day, thanks to so many science fiction movies. Computers are what we tell them to be. That's what the whole book was about: educating fools about computers.
1,422 reviews, 8 followers
September 4, 2023
This provides a great time capsule in the form of Michael Crichton's thoughts and recommendations about computers in 1983. Some of his predictions were more correct than others, but I'm impressed with how many of his general thoughts on the subject remain true to this day. Not surprisingly, he uses a style that keeps it from being too technical and keeps it interesting.
Rick
665 reviews, 4 followers
June 6, 2021
Interesting; wish I had read it 35 years ago.
Harrison
Author of 4 books, 68 followers
June 22, 2015
If it were 1982, I think this book would be very helpful. It's interesting to read now to see what Crichton predicted correctly and what he got wrong. His anecdotes and metaphors are interesting and well-conceived. "Electronic Life" shows just how ahead of the curve he was.
Don
178 reviews, 8 followers
November 16, 2014
Interesting to see someone's vision from 31 years ago of what life with computers would be like ... and what computing was like 31 years ago ...
