The first comprehensive introduction to information theory, this text explores the work begun by Shannon and continued by McMillan, Feinstein, and Khinchin. Its rigorous treatment addresses the entropy concept in probability theory and fundamental theorems as well as ergodic sources, the martingale concept, anticipation and memory, and other subjects. 1957 edition.
This book develops Information Theory somewhat further than Claude Shannon did. Although Khinchin praises Shannon for producing the ideas of Information Theory essentially single-handedly, he notes that Shannon deliberately restricted the cases he treated in order to simplify the solutions.
The book as a whole is divided into two major sections: the first is called "The Entropy Concept in Probability Theory" and the second "On the Fundamental Theorems of Information Theory." Both originally appeared as papers in Russian-language academic journals.
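To give a flavor of what that first paper is about: the entropy of a discrete probability distribution is H = -Σ p_i log2(p_i), the average uncertainty carried by one outcome. Here is a minimal Python sketch of that formula (my own illustration, not code from the book):

import math

def shannon_entropy(probs):
    # Entropy in bits of a discrete distribution, skipping zero-probability
    # outcomes since p * log2(p) -> 0 as p -> 0.
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit
print(shannon_entropy([0.9, 0.1]))  # biased coin: ~0.47 bits
print(shannon_entropy([1.0]))       # certain outcome: 0.0 bits

A fair coin maximizes the uncertainty of a two-outcome source; a biased coin is easier to predict and therefore carries less information per symbol.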
The book was interesting, though admittedly I picked out another short one: it runs only 120 pages.
I knew it was time to go back to school when I was so desperate for some kind of analytical thought that I found this in a used bookstore in Somerville and read it cover to cover.