Grammatical Man is the first book to tell the story of information theory: how it arose with the development of radar during World War II, and how it has evolved since. It describes how the laws and discoveries of information theory now support controversial revisions to Darwinian evolution, begin to unravel the mysteries of language, memory and dreams, and stimulate provocative ideas in psychology, philosophy, art, music, computers and even the structure of society.
Perhaps its most fascinating surprise is the suggestion that order and complexity may be as natural as disorder and disorganization. Contrary to the entropy principle, which implies that order is the exception and confusion the rule, information theory asserts that order and sense can indeed prevail against disorder and nonsense. From the simplest forms of organic life to the words used to express our most complex ideas, from our genes to our dreams, from microcomputers to telecommunications, virtually everything around us follows simple rules of information. Life and the material world, like language, remain "grammatical." Grammatical man inhabits a grammatical universe.
An introduction to information theory and an attempted sketch of its applications, written by, unfortunately, a journalist. Shannon was still alive when this book was written and he provides a quote or two (so does Stanisław Ulam, for some reason), but his influence isn't nearly great enough to save it. The first part tries to explain information theory itself, which Campbell doesn't really understand, mainly through dubious word association involving the concept of entropy in thermodynamics, which is really not the same thing as information entropy—a lot of it is almost right, and probably a fair enough effort for a journalist. The following parts, where Campbell tries to piece together the implications for other fields, are all much worse: the same vague word association takes us from binary code to DNA, over a lot of nonsense about probability and Gödel's incompleteness theorem (which he at one point seems to mistake for Heisenberg's uncertainty principle?) to evolution, back to DNA, and finally through some version of Chomsky's bullshit to the workings of the brain and psychology, none of which he understands any better than Shannon's work. Here, too, some of it is almost right, but much more of it only looks superficially plausible and almost none of it is as related to information theory as Campbell thinks it is, or at the very least not in the way that he thinks it is.
Information theory admittedly looks a lot less esoteric these days, now that everyone knows what a bit is and we all have to deal with digital communication networks every damn day of our lives; the fact that this wasn't the case in 1982 explains some of the badness in this book, and some of the remainder is journalist syndrome, but the great bulk of it is due to Campbell fundamentally having a crank's brain.
I got this book for free (rescued it from a paper recycling bin) and I still feel I paid too much for it.
"We do not speak a language. Language speaks through us." Are we all just "mechanical animals, to quote Marilyn Manson? Is information, whether in language, computer code, or DNA what defines a living thing? "Sloth or entropy" is not our only destiny in life, pace Thomas Pynchon. GRAMMATICAL MAN makes a strong case for all humans really being cybernauts, even if we don't know it.
I have a hard time thinking of what to say about this book. Not because it is boring or hard to read, but because it manages to precisely synthesise complex and confusing ideas that everyone has visited in two minutes of daydreaming. I can't believe this book is more than 30 years old, and I can't come up with reasons why it isn't quoted more often in popular science.
There are no spoilers: information exists as an intrinsic part of reality and pervades any and all systems, both natural and artificial. Jeremy Campbell does his job exquisitely well: from simple beginnings and assumptions, he arrives at amazing and complex conclusions about how languages work, how life drives itself, and how complexity arises in the universe. And while doing it, the book parallels the very message he's conveying. It will definitely leave you with an enhanced respect and awe for complexity in nature, man-made systems and everything that exists.
From thermodynamics to computation; from quantum physics to genetics; from neuroscience and dreams to society. Jeremy Campbell wrote a book that remains current. In an excellent narrative, he highlights the universal and unifying character of "information."
I thought that the book was not argued very well. It was like, oh, this person said this and that person said that. No support offered except that of the person referred to. Its style seemed like, da da da. Overall the book was okay because it did present some interesting ideas. But if you're interested in a popular book covering information theory, I would suggest James Gleick's The Information. It's more interesting and much better written.
This book was recommended to me by one of my favorite teachers ever: Kathleen Saunders. This book gets into information theory, the human brain, the capacity for varying levels of consciousness and a lot of other mind-f*cking ideas. Some keywords you might learn...entropy, chaos, neoteny...
This book may very well change your ways of perceiving all forms of communication. Based on the theories of Shannon, it demonstrates how our means of communicating are affected by the laws of thermodynamics (especially entropy). Reading this book made me radically rethink the way I worked and the ways in which I managed my communication with: upper management; managers of other areas of the corporation; my direct and indirect reports; affiliate companies; the medical community; and patients being treated with our medications. Understanding how our forms of communication contain some degree of entropy, and how information degenerates into noise, led to major improvements as well as very low rates of misunderstanding and confusion.
I dig it. The sentiment is at times datedly humanist, but Jeremy Campbell is one heck of a writer and this book touches on a number of topics that I find deeply interesting (probability, Information Theory, cybernetics, genetics and linguistics).
This one has a place on my shelf between James Gleick's The Information and Douglas Hofstadter's Gödel, Escher, Bach. I haven't started reading my copy of Thomas Moynihan's Spinal Catastrophism but I suppose that would be a logical next step after finishing this.
I'm actually re-reading this book - the first time I read it, years and years ago, I thought it was totally inspired. His premise is that the universe is made up of three fundamental materials: energy, matter and information. But I need to read it again so as not to make this sound like religious rubbish.
Probably outdated by now, but when I read it, it was all a revelation to me. My introduction to information theory. I imagine it might still be a good place to start.
It is clear that a 300-page book cannot cover the history of our universe, from the Big Bang to human evolution, human communication, and artificial intelligence, as Campbell tries to do here. Yet, an understanding of where we stand and where we are headed as a species requires an examination of the entirety of events leading to our emergence (“creation”). Campbell’s thesis is that information is at the root of everything, not just the obvious domains of genetics, languages, and brain functions, but also nature as a whole.
The science of information was born during World War II, directly affecting communications and computing, but also stimulating ideas in biology, linguistics, probability theory, psychology, sociology, philosophy, and art, that is, nearly everything! Processes found in nature do have random elements, but they mostly resemble the orderly formation of sentences in a language, yielding boundless structures from a finite collection of “words.” So, we don’t just have a grammatical man, as the book’s title suggests, but also a grammatical universe.
There’s a good deal of overlap between this book and Yuval Noah Harari’s Sapiens: A Brief History of Humankind, which I have reviewed before.
Campbell’s discussion of information theory also overlaps with the biopic A Mind at Play: How Claude Shannon Invented the Information Age, by Jimmy Soni and Rob Goodman.
The book is composed of four parts, whose titles and chapters are listed below, plus an afterword entitled "Aristotle and DNA," which elaborates on how Aristotle's view of the world was biological/informational, rather than mechanical, making him the sole classical thinker to have seen a glimmer of the theory of information.
Part 1. Establishing the Theory of Information (6 chapters): The second law and the Yellow Peril; The noise of heat; The demon deposed; A nest of subtleties and traps; Not too dull, not too exciting; The struggle against randomness.
Part 2. Nature as an Information Process (4 chapters): Arrows in all directions; Chemical word and chemical deed; Jumping the complexity barrier; Something rather subtle.
Part 3. Coding Language, Coding Life (5 chapters): Algorithms and evolution; Partly green till the day we die; No need for ancient astronauts; The clear and the noisy messages of language; A mirror of the mind.
Part 4. How the Brain Puts It All Together (6 chapters): The brain as cat on a hot tin roof and other fallacies; The strategies of seeing; The bottom and top of memory; The information of dreams; The left and right of knowing; The second-theorem society.
I enjoyed this. This was pleasantly coherent for science writing. He discusses information theory and thermodynamics, then goes into describing scientific theories in a range of fields as they relate to information. I would probably have found it more engaging if not for the fact that ~70% of the material was already familiar to me -- I skimmed some of the first and fourth sections.
How information entropy and thermodynamic entropy are related to each other has puzzled me for a bit, and his attempt is the best I've seen yet at bridging them, via the notion of probability and missing information (see also: ET Jaynes). For the later parts of the book, while it seemed like several patterns he draws on are incompatible (e.g. the striving for a universal grammar (i.e. a priori rules) vs using statistical patterns (i.e. a posteriori observations) vs the impossibility of a complete description in the first place vs the individuality of certain experiences), he tries to weave the thread of information theory through everything, as well as the themes of constraints vs freedom, entropy vs redundancy.
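To make that bridge concrete for anyone following along: Shannon's entropy is literally a measure of missing information, the average number of yes/no questions needed to pin down an outcome, and Jaynes reads thermodynamic entropy as the same quantity applied to our ignorance of a system's exact microstate. A minimal sketch of the shared quantity (my own illustration, not from the book):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits.

    Read as 'missing information': the average number of yes/no
    questions needed to identify which outcome occurred.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))    # fair coin: 1.0 bit missing
print(shannon_entropy([0.99, 0.01]))  # biased coin: ~0.08 bits missing
```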
Overall I found the book valuable. It's a summary of (then) recent scientific thought across a broad range of fields, containing several ideas that were new to me. Some that I found particularly interesting were David Layzer's cosmological argument for why there is order instead of disorder, neoteny in humans, the importance of second-order genetic information, and Noam Chomsky's universal grammar. I would be interested in seeing how these ideas have evolved in the ~40 years since this book was written.
I read this fabulous book in college. It opened up a new world for me by tracing the emergence of information theory and the digital civilization it spawned.
It introduces the titans of the information revolution: Claude Shannon, Norbert Wiener, and John von Neumann. These men created information theory, cybernetics and the modern computer. They will be remembered as descendants of the titans of the industrial age: Isaac Newton, Galileo Galilei and Ludwig Boltzmann, who ushered in modern energy science.
All information was seen as a system of signals. If the signals were too new, they could not be properly recognized and used. If they were too old, they could degenerate into useless noise.
Information was something other than energy. It had no physical dimension, but was still susceptible to entropy. The implications are enormous.
Just as we face the challenge of global warming from entropic heat, we now face the challenge of information smog from entropic noise. The math is the same.
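To spell out the sense in which "the math is the same": Shannon's entropy and the Gibbs entropy of statistical mechanics share a single functional form, differing only in the base of the logarithm and a physical constant,

$$H = -\sum_i p_i \log_2 p_i \ \text{(Shannon, in bits)}, \qquad S = -k_B \sum_i p_i \ln p_i \ \text{(Gibbs, in joules per kelvin)}.$$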
Heat and noise are merging in today's world, and humanity is now at risk: feedback bottlenecks prevent people from adapting in time to the pressures acting on the physical and mental environments that make up our modern human civilization. There is a danger of a system crash due to faulty signals in our media and state institutions. Keeping the balance between too many noisy signals, both old and new, will determine the survival of the human race.
Top ten reads of all time. I generally like the books I choose in anticipation of the topic; it is rare that I dislike something I have researched and know I will like on some level. This book, however, came to me on a whim while browsing through Strand Books in New York. I came across it in the linguistics section and it looked interesting immediately. Jeremy Campbell writes exceptionally well on topics that are difficult to integrate. He posits that Information Theory is all around us, in everything, and he backs it up in a seamless foray into genetics, cryptography, linguistics, and psychology. It's immensely entertaining, and it left me wondering why this book didn't get a Pulitzer Prize like Gödel, Escher, Bach: An Eternal Golden Braid. He certainly contributed to the world of thought, and I would say that if this book were more widely read, and its contents taught classically, the construction of the world itself could have looked very different, had we thought about Information Theory more often.
In summation, it is a delightful read, and I wish it were three times as long! Bravo Jeremy Campbell, I will forever be thinking about the world in a different way, thanks to your superb writing and a mind that can put together seemingly unrelated, but naturally connected, topics like these.
This is an immensely readable book that connects scientific research and theory on evolution and genetics about the origin of life to the origin of language. The forces that propel the atoms of matter that constitute all life are evolved internally, not by some external force, but through a natural, self-determined evolution that is bound by certain innate rules; rules that do not bind or limit human development but enable great flexibility in surprising though constrained ways. The author provides lucid detail about historical thought, showing how much sense and order prevail in the universe against nonsense and chaos, advancing toward increasingly complex structures of physical and linguistic development. Concepts such as how the code for DNA
Popular science meets information theory in the hands of a journalist. Written in 1982, this hasn't aged well. It's not just that it predates the web by almost a decade, but it is also a little too swayed by then-fashionable arguments about evolution. Not only that but, after the sober stuff on communication, coding, entropy, and DNA, it runs out of steam: chapters on human cognitive psychology can't really carry the information theory theme. Nevertheless, I still enjoyed this book when I first read it, when it was first published. But now it is superseded by Gleick's "The Information".
Excellent read! Coming in with a knowledge of information theory from both the coding theory perspective and the linguistics perspective, this book felt super fun to go through. It does an excellent job of laying the foundation for how and why info theory can show up in so many places like an easter egg.
I'll be recommending this one to my friends who want to know why I think information theory is cool.
Like the previous reviewers said, the author did a fantastic job by using information theory to string biology, linguistics, brain, human society and so on. In an era when disciplines are becoming more and more specialized, a broad perspective can enable readers to view the world differently.
I've finished reading this book, but I am quite sure I will read it again. Here is a book that teaches you about thermodynamics, information theory, syntactic theory, DNA structure, psychology, and much more. I highly recommend it.
A contemporary deep dive into Information Theory from back in the day, the early 1980s. People were just learning to read DNA sequences and beginning to explore artificial intelligence. Much has changed since then; comparing where we are now with where we were then is astonishing.
"At intervals, Wiener would walk into Fano's room, puffing at a cigar, and say: "Information is entropy." Then he would turn around and walk out again without another word."
Jeremy Campbell's Grammatical Man is a wonderful, ambitious book. It is well written. The prose is a delight. But what makes it really stand out is the breadth of its focus. Campbell's premise is that information theory informs a very wide range of topics.
To be efficient an information system has to balance rules with originality. The rules are necessary to create some order in a disorderly world. Without rules the message will be overwhelmed by the noise. But if the rules are too rigid and overarching they limit what can be conveyed. Successful information systems can be seen as dynamic systems with distinctly chaotic behavior. Campbell argues that this basic framework is helpful for understanding a very broad range of topics.
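The tradeoff is easy to see in miniature with a repetition code: spending channel capacity on redundancy (the "rules") lets a message survive noise that would otherwise swamp it. A toy sketch in Python, my own illustration rather than anything from the book:

```python
import random

def noisy_channel(bits, flip_prob):
    """Binary symmetric channel: each bit flips with probability flip_prob."""
    return [b ^ (random.random() < flip_prob) for b in bits]

def encode_repetition(bits, n=3):
    """Add redundancy (the 'rules'): repeat each bit n times."""
    return [b for bit in bits for b in [bit] * n]

def decode_repetition(bits, n=3):
    """Recover the message by majority vote over each block of n bits."""
    return [int(sum(bits[i:i + n]) > n // 2) for i in range(0, len(bits), n)]

random.seed(0)
message = [random.randint(0, 1) for _ in range(1000)]
bare = noisy_channel(message, 0.1)
coded = decode_repetition(noisy_channel(encode_repetition(message), 0.1))
print(sum(a != b for a, b in zip(message, bare)))   # ~100 errors without redundancy
print(sum(a != b for a, b in zip(message, coded)))  # ~28 errors with it
```

And the other horn of the dilemma is visible too: repeating each bit a thousand times would make the channel nearly errorless while conveying almost nothing per symbol, which is exactly the rigidity Campbell warns against.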
Campbell starts by noting that successful information systems can be thought of as processes that decrease entropy. A primary example of this is the role of DNA, as an information system, in the evolution of life. Campbell notes that a simplistic version of the dynamics of Darwinian evolution – random genetic changes subjected to survival of the fittest – cannot really account for the actual pattern of evolution. Species do not really evolve smoothly in the geologic record. New species tend to begin and end in a much more discrete way. Modern genetics has come to recognize that something more complicated must be driving the process. Understanding DNA as an information system – a balance between reliable replication and a structure that accommodates constructive innovation, generating chaotic dynamics – helps to explain the actual record of evolution.
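That balance between replication fidelity and innovation can be played with directly. A toy selection loop (in the spirit of later "weasel"-style demonstrations, not a model from the book) shows that zero mutation can never improve, heavy mutation destroys what selection builds, and a modest mutation rate climbs steadily:

```python
import random

TARGET = "METHINKS IT IS LIKE A WEASEL"
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "

def replicate(parent, mutation_rate):
    """Copy a string, randomizing each character with some probability."""
    return "".join(random.choice(ALPHABET) if random.random() < mutation_rate else c
                   for c in parent)

def fitness(s):
    """Number of characters matching the target."""
    return sum(a == b for a, b in zip(s, TARGET))

def evolve(mutation_rate, generations=500, brood=100):
    """Each generation, keep the fittest of `brood` imperfect copies."""
    parent = "".join(random.choice(ALPHABET) for _ in TARGET)
    for _ in range(generations):
        parent = max((replicate(parent, mutation_rate) for _ in range(brood)),
                     key=fitness)
    return fitness(parent)

random.seed(1)
for rate in (0.0, 0.02, 0.5):   # too faithful, balanced, too sloppy
    print(rate, evolve(rate))   # ~1, 28, and well short of 28, respectively
```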
Campbell also focuses on language, particularly Chomsky's concepts of deep structure and universal grammar. Campbell argues that those concepts can be understood as examples of structure in an efficient information system. But he also argues that they are very deep-seated. Children learn language so rapidly that the deep structure of language probably is not learned. Rather, it appears to be coded into our brains.
The final part of Grammatical Man focuses on other aspects of the brain, in particular sight and memory. Campbell notes that human sight and memory are not simple information processes. They work by imposing structure on the information they receive. Optical illusions occur when that structure in pattern recognition fails. Similarly, the brain tries to fit incoming information into pre-existing frameworks/narratives. As a result, our memory has certain predictable biases. For example, we tend to remember information that fits together rather than unconnected facts. That means we tend to impose order on information where none exists.
I'm not an expert in these fields. But I found Campbell’s arguments compelling and deeply thought provoking.
Moreover, it was a joy to read. I highly recommend it.
The mapping between the symbolic and the substantial is not always a linear one: a constituent element of both cosmological and sentient reality is that the link is often one of information about information. Entropic and biological evolution inherently unfold towards disorder and complexity respectively, each configured according to how information is conjoined with underlying laws, and by the paradoxical juxtaposition of the statistical at macro levels with the (unattainably) factual when we tunnel down into the micro.
A stunning book with incredible breadth, ranging from biological systems such as DNA to cognitive systems such as the brain and mind, and from thermodynamic systems to the complex societies we inhabit. This book talks about how information theory is revolutionising and connecting fields as disparate as linguistics, complexity science and computational theory. It talks about how redundancy is essential for intelligibility in all fields, and how information and codes are the basis for anything that is intelligible and has a message.
Interesting and new (to me) ideas about information theory and how it applies to evolution, genetics, and language. It is not overly technical, which is both a plus and a minus: It is easy to read and isn't overwhelming for a layperson, but it lacks depth and clarity in some areas.