Excellent introduction treats three major topics: analysis of channel models and proofs of coding theorems; study of specific coding systems; and study of statistical properties of information sources. Appendix summarizes Hilbert space background and results from the theory of stochastic processes. Advanced undergraduate to graduate level. Bibliography.
Information theory started at Bell Labs with the problem of communicating through a noisy channel, and it developed into a branch of engineering spanning communication technology and computation, namely what goes into the internet these days. I like that Dover offers cheap reprints of old books. Unfortunately, the style and pedagogy are usually subpar, which makes for a slog to read. I find that the first textbooks on a new topic are usually bad: it takes time for the people who know the subject to figure out the finer points that trip up a student, and the explanations and metaphors that help students grasp the topic take time to develop. Because Dover reprints old science and engineering textbooks, the pedagogy is dated and the book is a slog. If you are willing to spend a little more, a newer textbook in a mature subject communicates more effectively. I will leave a video which gives a taste of the subject so that you get more than my complaints.
Information theory, like game theory, is one of those fields of mathematics that has become interesting to people who have no idea what mathematics is but want to lend some sort of scientific weight to their predetermined social theories. Thankfully, those types of people won’t make it 10 pages into this book.
This is a mathematics book. Robert B. Ash, the late professor at University of Illinois, sets forth the idea of his book in the preface:
“I think it is fair to say that this book is concerned entirely with the Shannon formulation, that is, the body of mathematics knowledge which has its origins in Shannon’s fundamental paper of 1948. This is what ‘Information Theory’ will mean for us here.”
Here, there are no wordy definitions, no philosophical proposals, no “theoretical” implications about what information even is, man, or how it affects society. This is a textbook full of mathematical proofs and illuminations of results that Shannon left implied in his original paper, A Mathematical Theory of Communication (1948). It is a commentary on and explanation of Shannon’s work, with practice problems and solutions - essentially what Robert Ash was teaching his graduate mathematics students at the University of Illinois when this book was published in 1965.
Between 1965 and now, Information Theory has proved to be an indispensable field of modern life. We have made such astonishing advances on Shannon’s original work that it might seem any book from the days before the internet must be hopelessly outdated. That is not the case here. By concentrating entirely on the mathematics, Robert Ash has provided a supplementary textbook for any serious student of informatics.
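As a taste of the kind of quantity at the heart of the Shannon formulation (my own illustration, not an example from the book): the entropy of a discrete source measures, in bits, the average information per symbol. A minimal sketch in Python:

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits.
    Outcomes with probability zero contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit per toss.
print(entropy([0.5, 0.5]))   # 1.0
# A heavily biased coin carries less information per toss.
print(entropy([0.9, 0.1]))   # about 0.469
```

Results like the noisy-channel coding theorem, which the book proves in full, are built on this quantity and its conditional variants.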
Ash mentions early on that he tried to keep the prerequisites to a minimum and that his book should be comprehensible to first-year graduate students of mathematics. This is probably accurate. The mathematics is certainly beyond what an average undergraduate student in the hard sciences learns and far above anything the student of the social sciences ever studies. A working knowledge of probability theory is absolutely essential, but there is enough material on Hilbert spaces in the appendix to follow the work of the later chapters.
In short, this book is:
- Not for people who just want to pad their social theories with real science
- Worthwhile for people who want to get a basic understanding of Shannon’s work and are willing to patiently work through the mathematics (me)
- Supplementary for graduate students of computer science and informatics
Personally, I did not much like Ash's approach to information theory: it seemed too tied to mathematical analysis, so the practical meaning of the information measure gets lost. That said, the text is still worthwhile for a fairly complete overview of basic information theory, including the part on channels with memory (and hence on Markov chains) and on continuous signals, as well as a closer look at error-correcting codes.
Quite fascinating, but I don't really know much about probability and that would have helped out a great deal. I have a book on probability, so perhaps I will read that next.
This book expands on the Information Theory proposed by Claude Shannon in 1948 with his paper A Mathematical Theory of Communication. It covers the proofs and goes into further depth than Shannon did. Contains problems and solutions in the back, along with an appendix that explains some stuff about Hilbert spaces.