Entropy and Information Theory

This book is devoted to the theory of probabilistic information measures and their application to coding theorems for information sources and noisy channels. The eventual goal is a general development of Shannon's mathematical theory of communication, but much of the space is devoted to the tools and methods required to prove the Shannon coding theorems. These tools form an area common to ergodic theory and information theory and comprise several quantitative notions of the information in random variables, random processes, and dynamical systems. Examples are entropy, mutual information, conditional entropy, conditional information, and discrimination or relative entropy, along with the limiting normalized versions of these quantities such as entropy rate and information rate. Much of the book is concerned with their properties, especially the long-term asymptotic behavior of sample information and expected information. This is the only up-to-date treatment of traditional information theory emphasizing ergodic theory.
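For readers who want a concrete handle on the quantities the description names, here is a minimal Python sketch, not taken from the book, of entropy, relative entropy (discrimination), and mutual information for finite alphabets, using base-2 logarithms (bits). The function names and the binary-symmetric-channel example are illustrative assumptions only; the book develops these notions in far greater generality for random processes and dynamical systems.

import math

def entropy(p):
    # Shannon entropy H(p) = -sum_x p(x) log2 p(x) of a discrete distribution
    # given as a list of probabilities; terms with p(x) = 0 contribute nothing.
    return -sum(px * math.log2(px) for px in p if px > 0)

def relative_entropy(p, q):
    # Relative entropy (discrimination) D(p || q) = sum_x p(x) log2(p(x)/q(x)).
    # Infinite if p puts mass on a symbol where q does not.
    d = 0.0
    for px, qx in zip(p, q):
        if px > 0:
            if qx == 0:
                return math.inf
            d += px * math.log2(px / qx)
    return d

def mutual_information(joint):
    # Mutual information I(X;Y) = D(p_XY || p_X x p_Y) for a joint pmf
    # given as a 2-D list indexed by (x, y).
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    i = 0.0
    for x, row in enumerate(joint):
        for y, pxy in enumerate(row):
            if pxy > 0:
                i += pxy * math.log2(pxy / (px[x] * py[y]))
    return i

# Illustrative example (an assumption, not from the book): a binary symmetric
# channel with crossover probability 0.1 driven by a uniform input.
joint = [[0.45, 0.05],
         [0.05, 0.45]]
print(entropy([0.5, 0.5]))                        # 1.0 bit
print(relative_entropy([0.5, 0.5], [0.9, 0.1]))   # about 0.74 bits
print(mutual_information(joint))                  # about 0.53 bits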

332 pages, Hardcover

First published September 4, 1990

About the author

Robert M. Gray

18 books · 5 followers
Robert M. Gray, of Stanford University.

See also:
Robert M. Gray Jr., editor of Technology Integration in Higher Education.
Robert M. Gray, author of The Church and the Older Person.

Robert M. Gray was born in San Diego, Calif., on November 1, 1943. He received the B.S. and M.S. degrees from M.I.T. in 1966 and the Ph.D. degree from U.S.C. in 1969, all in Electrical Engineering. Since 1969 he has been with Stanford University, where he is currently the Lucent Technologies Professor of Engineering, Emeritus, and Professor of Electrical Engineering, Emeritus. His research interests are in information theory and signal processing, especially in the theory and practice of quantization, compression, and classification.
He was a member of the Board of Governors of the IEEE Information Theory Group (1974-1980, 1985-1988) and of the IEEE Signal Processing Society (1998-2001). He was Associate Editor for Source Coding (1977-1980) and Editor-in-Chief (1980-1983) of the IEEE Transactions on Information Theory. He is currently the Editor-in-Chief of Foundations and Trends in Signal Processing. He was Co-Chair of the 1993 IEEE International Symposium on Information Theory and Program Co-Chair of the 1997 and 2004 IEEE International Conferences on Image Processing.

He is a Fellow of the Institute of Mathematical Statistics and the IEEE and has held fellowships from the Japan Society for the Promotion of Science at the University of Osaka (1981), the John Simon Guggenheim Foundation at the University of Paris XI (1982), and NATO/Consiglio Nazionale delle Ricerche at the University of Naples (1990). During spring 1995 he was a Vinton Hayes Visiting Scholar at the Division of Applied Sciences of Harvard University. He is a Faculty Affiliate of the Clayman Institute for Gender Studies at Stanford University, where he was a Faculty Research Fellow during the 2008-2009 academic year.

He was corecipient of the 1976 IEEE Information Theory Group Paper Award and the 1983 IEEE ASSP Senior Award. He was awarded an IEEE Centennial medal (1984) and an IEEE Third Millennium Medal (2000). He received the 1993 Society Award, the 1998 Technical Achievement Award, the 2005 Meritorious Service Award, and the 2009 Education Award from the IEEE Signal Processing Society. He received a Golden Jubilee Award for Technological Innovation (1998) and the 2008 Shannon Award from the IEEE Information Theory Society. He received the 2008 IEEE Jack S. Kilby Signal Processing Medal. He received a 2002 Presidential Award for Excellence in Science, Mathematics and Engineering Mentoring (PAESMEM) and the 2003 Distinguished Alumni in Academia Award from the University of Southern California. He is a member of the National Academy of Engineering (2007). He holds an Advanced Class Amateur Radio License (KB6XQ).

Community Reviews

5 stars: 3 (42%)
4 stars: 1 (14%)
3 stars: 3 (42%)
2 stars: 0 (0%)
1 star: 0 (0%)
dnf
December 31, 2024
Tried to find a book on info theory that focuses on intuition and applications. This is highly formal, grounding info theory in the mathematical notation used in physics, which I haven't touched since I was an undergrad. This book may appeal to graduate students in math, but I would not recommend it for practitioners.