Status Updates From Elements of Information Theory

Elements of Information Theory
by Thomas M. Cover and Joy A. Thomas

Status Updates (showing 1-30 of 81)


M is on page 254 of 784
In the middle of reading about differential entropy. - 22:22, 04.01.2026.
Jan 04, 2026 01:22PM

M is on page 109 of 784
Skipped the problems because reading on a bus is maybe not the best environment for deep thought. So instead I started the next chapter, on data compression, and got to the Kraft inequality. I remember learning about it as a side remark in my introductory TCS class in undergrad; I didn't remember the proof being so clean.
Sep 16, 2025 10:17PM
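
For reference, the Kraft inequality mentioned above, in its standard form: for any instantaneous (prefix) code over a D-ary code alphabet with codeword lengths l_1, l_2, \ldots, l_m,

\sum_{i=1}^{m} D^{-l_i} \leq 1,

and conversely, for any set of lengths satisfying this bound there exists a prefix code with those codeword lengths.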

M is on page 88 of 784
Finished the chapter on entropy rates of stochastic processes. A very fun and light intro, though I'm struggling to figure out what I'm missing in the first exercise.
Sep 10, 2025 07:20PM
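
For reference, the entropy rate of a stochastic process \{X_i\} covered in that chapter is defined as

H(\mathcal{X}) = \lim_{n \to \infty} \frac{1}{n} H(X_1, X_2, \ldots, X_n)

when the limit exists; for a stationary process the limit exists and also equals \lim_{n \to \infty} H(X_n \mid X_{n-1}, \ldots, X_1).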

M is on page 78 of 784
Pretty good. The AEP chapter was nice and short. I like that the book motivated the entropy rate of a stationary stochastic process too.
Sep 04, 2025 10:04PM
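
For reference, the AEP mentioned above, in its basic form: if X_1, X_2, \ldots are i.i.d. with pmf p(x), then

-\frac{1}{n} \log p(X_1, X_2, \ldots, X_n) \to H(X) \quad \text{in probability},

which is what drives the typical-set arguments in that chapter.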

M is on page 43 of 784
Finished Chapter 1 up to the summary. Good so far, though I don't have a good intuition for some of the equalities, such as the chain rule for mutual information. - 23:16, 18.08.2025
Aug 18, 2025 08:16PM
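
For reference, the chain rule for mutual information referred to above is

I(X_1, X_2, \ldots, X_n; Y) = \sum_{i=1}^{n} I(X_i; Y \mid X_{i-1}, \ldots, X_1),

which follows from applying the chain rule for entropy to H(X_1, \ldots, X_n) and to H(X_1, \ldots, X_n \mid Y) and taking the difference.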

M is on page 38 of 784
Got stuck in the proof of Theorem 2.10.1 (Fano's inequality). I don't have a good feel for how the probability Pr(X \neq \tilde{X}) is supposed to be random, and the inequalities in (2.135) don't make sense to me yet.
Aug 11, 2025 08:08PM
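
For reference, Fano's inequality in one standard form: for any estimator \tilde{X} of X based on Y, with error probability P_e = \Pr(\tilde{X} \neq X),

H(P_e) + P_e \log(|\mathcal{X}| - 1) \geq H(X \mid Y),

which in particular gives the weaker bound P_e \geq \frac{H(X \mid Y) - 1}{\log |\mathcal{X}|}. The randomness lives in the pair (X, Y): \tilde{X} = g(Y) is itself a random variable, and P_e is the (non-random) probability that it differs from X.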

M is on page 30 of 784
Read up to Section 2.7.

Finally got a feel for what mutual information and the KL divergence are. I especially liked the interpretation in terms of encoding a random variable X: the KL divergence is the inefficiency, the additional bits needed on average when one uses an encoding that assumes X follows q(x) while the true probability mass function is p(x).

The Venn diagram with H(X | Y), H(Y | X), and I(X;Y) was helpful too.
Aug 10, 2025 08:21PM
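
As a small sketch of those relations (not from the book; the joint distribution below is made up for illustration), here is a Python snippet that checks numerically that I(X;Y) = H(X) - H(X|Y) = H(Y) - H(Y|X) = D(p(x,y) || p(x)p(y)):

from math import log2

# A small made-up joint pmf p(x, y) for X, Y each taking values in {0, 1}.
p_xy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal pmfs p(x) and p(y).
p_x = {x: sum(p for (xx, _), p in p_xy.items() if xx == x) for x in (0, 1)}
p_y = {y: sum(p for (_, yy), p in p_xy.items() if yy == y) for y in (0, 1)}

def H(dist):
    # Shannon entropy in bits of a pmf given as a dict of probabilities.
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Conditional entropies via the chain rule H(X, Y) = H(Y) + H(X | Y).
H_x_given_y = H(p_xy) - H(p_y)
H_y_given_x = H(p_xy) - H(p_x)

# Mutual information as the KL divergence D(p(x, y) || p(x) p(y)).
I_kl = sum(p * log2(p / (p_x[x] * p_y[y])) for (x, y), p in p_xy.items() if p > 0)

print(H(p_x) - H_x_given_y)  # I(X; Y) via H(X) - H(X | Y)
print(H(p_y) - H_y_given_x)  # I(X; Y) via H(Y) - H(Y | X)
print(I_kl)                  # I(X; Y) via the KL divergence

All three printed values agree (roughly 0.278 bits for this particular joint distribution).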

Duc Nguyen is on page 268 of 784
May 19, 2025 10:34AM

Duc Nguyen is on page 208 of 784
May 18, 2025 07:19AM

Duc Nguyen is on page 184 of 784
May 16, 2025 07:48AM

Duc Nguyen is on page 82 of 784
May 09, 2025 07:55AM

Duc Nguyen is on page 38 of 784
May 08, 2025 03:13AM

Dex H is on page 28 of 784
Aug 17, 2023 04:04AM

Dawith is on page 75 of 784
Dec 08, 2022 09:12PM

Tra Ngo is on page 9 of 784
Dec 01, 2021 08:53PM

Tra Ngo is on page 5 of 784
Aug 29, 2021 04:36PM

Adam is on page 71 of 784
Dec 14, 2020 06:59PM
