By Arieh Ben-Naim
The central message of this book is that thermodynamics and statistical mechanics will benefit from replacing the unfortunate, misleading and mysterious term "entropy" with a more familiar, meaningful and appropriate term such as information, missing information or uncertainty. This replacement would facilitate the interpretation of the driving force of many processes in terms of informational changes and dispel the mystery that has always enshrouded entropy.
It has been 140 years since Clausius coined the term "entropy"; almost 50 years since Shannon developed the mathematical theory of information, subsequently renamed entropy. In this book, the author advocates replacing entropy by information, a term that has become common in many branches of science.
The author also takes a new and bold approach to thermodynamics and statistical mechanics. Information is used not only as a tool for predicting distributions, but as the fundamental cornerstone concept of thermodynamics, a role held until now by the term entropy.
The topics covered include the fundamentals of probability and information theory; the general theory of information as well as the particular concept of information as applied in thermodynamics; the re-derivation of the Sackur–Tetrode equation for the entropy of an ideal gas from purely informational arguments; the fundamental formalism of statistical mechanics; and many examples of simple processes whose driving force is analyzed in terms of information.
Contents: Elements of Probability Theory; Elements of Information Theory; Transition from the General MI to the Thermodynamic MI; The Structure of the Foundations of Statistical Thermodynamics; Some Simple Applications.
Similar thermodynamics and statistical mechanics books
The fundamentals of aerothermodynamics are treated in this book with special regard to the fact that the outer surfaces of hypersonic vehicles are primarily radiation cooled. The consequences of this fact differ for different vehicle classes. In any case, the properties of both attached viscous and separated flows are important in this regard.
- Challenges to the second law of thermodynamics theory and experiment
- Rational Thermodynamics
- Technische Thermodynamik: Eine Einführung in die Thermo- und Gasdynamik
- Phase transition dynamics
- Theory of Heat
- Termodinamica molecular de los equilibrios de fase
Extra info for A Farewell To Entropy: Statistical Thermodynamics Based On Information
or equivalently

T = m⟨v²⟩/3 (in units with k_B = 1).

Hence

A = N m⟨v²⟩/2 − N(S/N)(m⟨v²⟩/3) = (N m⟨v²⟩/2)(1 − 2S/(3N)).

This expression includes the entropy, or rather the missing information (MI). Since S is an extensive quantity, S/N is the MI per particle in this system.

2 The Association of Entropy with Disorder

During over a hundred years of the history of entropy, there have been many attempts to interpret and understand entropy. We shall discuss the two main groups of such interpretations of entropy. The earliest, and nowadays the most common, interpretation of the entropy is in terms of disorder, or any of the related concepts such as "disorganization," "mixed-upness," "spread of energy," "randomness," "chaos" and the like.
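The algebra relating the Helmholtz energy to the MI can be checked numerically. In units with k_B = 1, T = m⟨v²⟩/3, the energy of the ideal gas is E = N m⟨v²⟩/2, and A = E − TS factors as (N m⟨v²⟩/2)(1 − 2S/(3N)). The specific numbers below are illustrative, not taken from the book:

```python
# Numerical check of the free-energy identity (units with k_B = 1):
#   T = m<v^2>/3,  E = N m<v^2>/2,  A = E - T*S = (N m<v^2>/2) * (1 - 2S/(3N))
m, v2, N, S = 1.0, 3.0, 100.0, 250.0  # mass, mean squared speed, particle number, entropy

T = m * v2 / 3.0        # temperature from the mean kinetic energy
E = N * m * v2 / 2.0    # total kinetic energy of the ideal gas
A_direct = E - T * S    # Helmholtz energy, A = E - TS
A_factored = (N * m * v2 / 2.0) * (1.0 - 2.0 * S / (3.0 * N))

assert abs(A_direct - A_factored) < 1e-9
print(A_direct)  # -100.0 for these illustrative values
```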
It was developed in the 16th and 17th centuries. The theory emerged mainly from questions about games of chance addressed to mathematicians. A typical question that is said to have been addressed to Galileo Galilei (1564–1642) was the following: Suppose that we play with three dice and we are asked to bet on the sum of the outcomes of tossing the three dice simultaneously. Clearly, we feel that it would not be wise to bet on the outcome 3, nor on 18; our feeling is correct (in a sense discussed below).
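The intuition about the three-dice bet can be confirmed by brute-force enumeration of all 6³ = 216 equally likely outcomes; a minimal sketch:

```python
from itertools import product
from collections import Counter

# Count how many of the 216 equally likely (d1, d2, d3) outcomes give each sum.
counts = Counter(sum(dice) for dice in product(range(1, 7), repeat=3))

# Sums 3 and 18 each occur in exactly one way ((1,1,1) and (6,6,6)),
# so each has probability 1/216 -- the worst possible bets.
# Sums 10 and 11 each occur in 27 ways (probability 27/216), the best bets.
print(counts[3], counts[18], counts[10], counts[11])  # -> 1 1 27 27
```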
Thus, neither the entropy nor the Shannon measure of MI is a subjective quantity. In fact, no one has claimed that either of these is subjective. The subjectivity of information enters only when we apply the concept of information in its broader sense. Jaynes pioneered the application of information theory to statistical mechanics. In this approach, the fundamental probabilities of statistical mechanics (see Chapter 5) are obtained by using the principle of maximum entropy. This principle states that the equilibrium distribution is the one that maximizes the entropy (or the MI) with respect to all other distributions consistent with the given constraints.
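A minimal illustration of the maximum-entropy principle in the simplest, unconstrained case: when the only constraint is normalization, the distribution that maximizes the Shannon measure of MI over n outcomes is the uniform one, with H = log n. The function name `shannon_mi` is my own, not the book's:

```python
import math
import random

def shannon_mi(p):
    """Shannon measure of missing information, H(p) = -sum p_i log p_i (in nats)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

n = 4
h_max = shannon_mi([1.0 / n] * n)  # uniform distribution gives H = log(n)

# Every other normalized distribution over n outcomes has strictly lower MI;
# sampling random distributions never exceeds the uniform value.
random.seed(0)
for _ in range(1000):
    w = [random.random() for _ in range(n)]
    p = [x / sum(w) for x in w]
    assert shannon_mi(p) <= h_max + 1e-12

print(round(h_max, 6), round(math.log(n), 6))
```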
A Farewell To Entropy: Statistical Thermodynamics Based On Information by Arieh Ben-Naim