
A Farewell to Entropy: Statistical Thermodynamics Based on Information

The principal message of this book is that thermodynamics and statistical mechanics would benefit from replacing the unfortunate, misleading and mysterious term "entropy" with a more familiar, meaningful and appropriate term such as information, missing information or uncertainty. This replacement would make it easier to interpret the "driving force" of many processes in terms of informational changes, and would dispel the mystery that has always enshrouded entropy.

It has been 140 years since Clausius coined the term "entropy," and almost 50 years since Shannon developed the mathematical theory of "information," a measure subsequently also named "entropy." In this book, the author advocates replacing "entropy" with "information," a term that has become widely used in many branches of science.

The author also takes a new and bold approach to thermodynamics and statistical mechanics. Information is used not only as a tool for predicting distributions but as the fundamental cornerstone concept of thermodynamics, a role held until now by the term "entropy." The topics covered include the fundamentals of probability and information theory; the general concept of information as well as the particular concept of information as applied in thermodynamics; the re-derivation of the Sackur-Tetrode equation for the entropy of an ideal gas from purely informational arguments; the fundamental formalism of statistical mechanics; and many examples of simple processes whose "driving force" is analyzed in terms of information.
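For reference (this is not part of the publisher's description), the Sackur-Tetrode equation mentioned above gives the entropy of a monatomic ideal gas of $N$ particles in a volume $V$ at temperature $T$; in its standard textbook form, with conventional symbols:

```latex
% Sackur-Tetrode equation for a monatomic ideal gas
% (standard form; symbols are conventional, not quoted from the book)
\frac{S}{N k_B} = \ln\!\left(\frac{V}{N\lambda^3}\right) + \frac{5}{2},
\qquad
\lambda = \frac{h}{\sqrt{2\pi m k_B T}}
```

where $\lambda$ is the thermal de Broglie wavelength. The book's claim, per the description, is that this result can be re-derived from purely informational arguments.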

411 pages, Kindle Edition

First published January 18, 2007


About the author

Arieh Ben-Naim

65 books · 14 followers

Ratings & Reviews



Community Reviews

5 stars: 8 (42%)
4 stars: 6 (31%)
3 stars: 3 (15%)
2 stars: 1 (5%)
1 star: 1 (5%)
Displaying 1 of 1 review
99 reviews · 13 followers
December 11, 2016
A monograph about the simple premise that Boltzmann entropy is equivalent to Shannon entropy.
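For context, the premise the reviewer refers to can be stated with the standard formulas (my gloss, not the reviewer's):

```latex
% Boltzmann entropy of W equally probable microstates
S = k_B \ln W
% Shannon entropy of a distribution p_1, \dots, p_W
H = -\sum_{i=1}^{W} p_i \ln p_i
% For the uniform case p_i = 1/W, H = \ln W, hence
S = k_B H
```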

He spends the main part of the book describing how neat his idea is and how neat it would be for everyone to get on board.

Only in the copious appendices are there examples of the potential insight of this premise.

It would have been better if it were more like the appendices. And if it included Kolmogorov complexity. And if it included parastatistics.
