Talk:Maximum entropy thermodynamics
From stub to article
To do:
- technical note that strictly the entropy should be relative to a prior measure. -> Principle of minimum cross-entropy (Kullback–Leibler divergence). In thermodynamics we usually assume the "principle of equal a priori probabilities" over phase space, so the two are then equivalent (a short sketch of why is below this comment).
- section on philosophical implications regarding the conceptual problems of statistical mechanics, second law, etc.
- (?) some more algebra, and a simple nonequilibrium example (e.g. Brownian motion?)
-- Jheald 12:47, 28 October 2005 (UTC)
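For reference, a minimal sketch of the equivalence claimed in the first to-do item, using assumed notation: discrete microstates i, probabilities p_i, prior measure m_i. The quantity maximized under the minimum cross-entropy principle is the relative entropy

    S_I = -\sum_i p_i \ln \frac{p_i}{m_i} .

If the prior measure is uniform over N microstates, m_i = 1/N (equal a priori probabilities), this reduces to

    S_I = -\sum_i p_i \ln p_i - \ln N ,

i.e. the Shannon entropy shifted by the constant \ln N, so the two principles select the same distribution.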
Introduction could be more friendly
(from the Article deletion page):
Note to author - please consider adding a few paragraphs up front in layman talk before getting on to the partial differentials. There ought to be something you can say about maximum entropy that I can slip into a casual conversation. Denni☯ 23:56, 28 October 2005 (UTC)
Average entropy, measured entropy and entropy fluctuations
At the moment the article isn't very clear about when it's talking about expectations (either as a constraint or as a prediction) and when it's talking about actual measurements. For example, in the discussion of the 2nd law, the measured macroscopic quantities probably won't land exactly on the predicted values -- instead (we assume) they will fall within the predicted margin of uncertainty.
This especially needs cleaning up in the context of entropy, particularly if we're going to discuss the fluctuation theorem.
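To illustrate the distinction, a sketch under assumed notation: a quantity A taking value A_i in microstate i, with MaxEnt probabilities p_i. The prediction is the expectation

    \langle A \rangle = \sum_i p_i A_i ,

with predicted uncertainty

    (\Delta A)^2 = \langle A^2 \rangle - \langle A \rangle^2 = \sum_i p_i (A_i - \langle A \rangle)^2 .

The measured value is not expected to equal \langle A \rangle exactly, only to lie within a few \Delta A of it (for macroscopic systems \Delta A / \langle A \rangle is typically tiny).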
Also, the new measurements will therefore contain (a little) new information, over and above what the predicted distribution already encodes. So it's not quite true that S_I is unchanged. It will still be a constant, but strictly speaking it becomes a different constant once we propagate the new information back, sharpening up our phase-space distribution for each earlier instant.
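Schematically (assuming the phase-space distribution p_t(x) is conditioned on a new measurement outcome D): under Liouville evolution the information entropy

    S_I = -\int p_t(x) \ln p_t(x) \, dx

is the same at every time t. Conditioning on D gives p_t(x|D), whose entropy is again constant in t under the same evolution, but in general differs from the old S_I; averaged over the possible outcomes it can only go down, since conditioning cannot increase expected entropy. So S_I remains a constant of the motion, just a different (typically smaller) constant once the new data are folded in.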