Talk:Maximum entropy thermodynamics
Physics: Start-class, Mid-importance
This article was nominated for deletion on 28 October 2005. The result of the discussion was keep.
From stub to article
To do:
- technical note that, strictly, the entropy should be relative to a prior measure -> Principle of minimum cross-entropy (Kullback-Leibler distance). In thermodynamics we usually assume the "principle of equal a priori probability" over phase space, so the two are then equivalent.
- section on philosophical implications regarding the conceptual problems of statistical mechanics, second law, etc.
- -- (?) DONE Jheald 22:07, 2 November 2005 (UTC)
- (?) some more algebra, and a simple nonequilibrium example (eg Brownian motion?)
-- Jheald 12:47, 28 October 2005 (UTC)
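The relative-entropy point in the first to-do item can be checked numerically: for a uniform prior measure over N states, the Kullback-Leibler distance satisfies D_KL(p || uniform) = log N - S(p), so maximizing the Shannon entropy is the same as minimizing the cross-entropy to that prior. A minimal sketch (the four-state distribution and function names here are illustrative, not from the article):

```python
import math

def shannon_entropy(p):
    """S(p) = -sum_i p_i log p_i, in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def kl_divergence(p, m):
    """D_KL(p || m) = sum_i p_i log(p_i / m_i)."""
    return sum(pi * math.log(pi / mi) for pi, mi in zip(p, m) if pi > 0)

# An arbitrary distribution over N = 4 microstates, and a uniform prior measure.
p = [0.1, 0.2, 0.3, 0.4]
N = len(p)
uniform = [1.0 / N] * N

# With a uniform prior, D_KL(p || uniform) = log(N) - S(p), so the
# maximum-entropy distribution is exactly the minimum-cross-entropy one.
assert abs(kl_divergence(p, uniform) - (math.log(N) - shannon_entropy(p))) < 1e-12
```

When the prior measure is non-uniform, the identity no longer holds, which is why the stricter relative-entropy formulation is needed.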
Introduction could be more friendly
(from the Article deletion page):
Note to author - please consider adding a few paragraphs up front in layman talk before getting on to the partial differentials. There ought to be something you can say about maximum entropy that I can slip into a casual conversation. Denni☯ 23:56, 28 October 2005 (UTC)
- You know, I tend to agree with you. But the problem is that whatever egghead wrote the damn article probably would have real trouble talking to a regular human being. Someone that bright probably would have trouble answering a question as easy as "Hey man, what's up?" You'd probably get as an answer some canned formulaic response that they'd learned to the question if not some Larry-Wallesque humor. If anything, I really fucking like this article? It reminds me of the good old days of wikipedia, when PhD level researchers from a buncha universities would pop on during a coffee break and spit out (arguably useful) articles on cutting edge theory, heavy academia or particle physics. Eventually the whole project got taken over by hack librarians, star trek bloggers and geeky high-school students, with the result that they pretty much demanded a level of organization to the writing that your average professor was unable/unwilling to do on their coffee break. This scared them off and most articles, even about important subjects, end up bland, overly-verifiable and smelling like they were written by some windex-and-spam fried committee. Honestly, the only good articles left on wikipedia are the ones on topics so obscure that people find nothing in them to be contentious. Like this article, for example. But anyways, brother I certainly see your points. Sorry for the outburst. Long-live Myspace, Facebook, Wal-Mart and Tom Hanks. —Preceding unsigned comment added by 10.250.65.158 (talk) 01:22, 6 October 2007 (UTC)
- Me again. There's the additional problem that sufficiently complex information reaches a point of irreducibility beyond which any further boiling down deprives the subject-matter of its inherent factuality. (cf. What the Bleep Do We Know http://en.wikipedia.org/wiki/What_the_Bleep_Do_We_Know%21%3F , a film designed to make quantum mechanics comprehensible to people of average intellect and new-age spiritual interest.) You try to make something like that so that regular folks can understand it, and the result is a hodge-podge of miswrought analogies and irreconcilable metaphors. Really, people are better off knowing that they just don't fucking know anything about a particular subject than thinking they can define five or ten vocabulary words of the industry jargon.
This entry is hard to follow
I have done a fair bit of reading and writing of Wiki entries on math and science, and this entry strikes me as one of the hardest to follow out of all those I've read. This entry badly needs to be turned around, but I don't quite know where to begin. If anyone conjectures that I am incompetent to form an opinion of this nature, let me point out that my education included Bayesian statistics and Shannon information.
This entry is important simply because E.T. Jaynes founded its subject with his 1957 Princeton Ph.D. thesis written under Eugene Wigner; Wigner was a worthy fellow, and Jaynes became one himself. In fact, I first encountered his name while learning Bayesian statistics. I think of Jaynes as having advanced the legacy of Gibbs and Boltzmann. In any event, this entry should also include a paragraph giving the history of its subject.
Another reason why this topic is a worthy one is that subjective probability and Bayesian statistics can be a powerful tool in the philosophy of science; see, for example, the work of Nick Bostrom. If the data can be adequately summarized via a likelihood function, then Bayes's Rule is a powerful way of reasoning inductively from those data. Hence I am surprised that this entry does not mention Bayes's Rule. 123.255.63.50 (talk) 06:25, 25 December 2008 (UTC)
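Since the comment above mentions Bayes's Rule and likelihood functions, here is a minimal numerical illustration of updating a prior with a likelihood (the two hypotheses and all the numbers are invented purely for illustration):

```python
# Bayes's Rule: posterior = likelihood * prior / evidence.
# Two rival hypotheses with an (invented) prior, and the likelihood
# P(D | H) each assigns to some observed data D.
prior = {"H1": 0.5, "H2": 0.5}
likelihood = {"H1": 0.8, "H2": 0.2}

# Evidence P(D) is the prior-weighted average of the likelihoods.
evidence = sum(prior[h] * likelihood[h] for h in prior)

posterior = {h: prior[h] * likelihood[h] / evidence for h in prior}

assert abs(sum(posterior.values()) - 1.0) < 1e-12
assert posterior["H1"] > posterior["H2"]  # data favour H1
```

The same update rule underlies the MaxEnt programme's treatment of macroscopic data: the constraints play the role the likelihood plays here.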
Average entropy, measured entropy and entropy fluctuations
At the moment the article isn't very clear about when it's talking about expectations (either as a constraint or as a prediction) and when about actual measurements. For example, in the discussion of the 2nd law, the measured macroscopic quantities probably won't land exactly on the predicted values -- instead (we assume) they will fall within the predicted margin of uncertainty.
This especially needs to be cleaned up in the context of entropy, particularly if we're going to discuss the fluctuation theorem.
Also, the new measurements will therefore contain (a little) new information, over and above the predicted distribution. So it's not quite true that SI is unchanged. It will still be a constant, but strictly speaking it will become a different constant, as we propagate the new information back, sharpening up our phase-space distribution for each instant back in time.
-- Jheald 15:51, 1 November 2005 (UTC)
- -- DONE Jheald 22:07, 2 November 2005 (UTC)
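The point above, that new measurement data sharpen the phase-space distribution, can be seen in miniature: conditioning a probability assignment on an observation that rules out some states lowers its information entropy. A toy sketch (the four-state distribution is invented for illustration):

```python
import math

def entropy(p):
    """Information entropy -sum_i p_i log p_i, in nats."""
    return -sum(x * math.log(x) for x in p if x > 0)

# Prior assignment: maximum ignorance over four microstates.
p = [0.25, 0.25, 0.25, 0.25]

# A measurement rules out the last two states; condition and renormalize.
surviving = p[:2]
total = sum(surviving)
p_conditioned = [x / total for x in surviving]

# The conditioned distribution is sharper: its entropy is strictly lower
# (log 2 instead of log 4), reflecting the information the data carried.
assert entropy(p_conditioned) < entropy(p)
```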
meaning of the words 'subjective' and 'objective'
The article at present, in a section called 'The nature of probabilities in statistical mechanics', writes: "According to the MaxEnt viewpoint, the probabilities in statistical mechanics are subjective (epistemic, personal), to the extent that they are conditioned on a particular model for the underlying state space (e.g. Liouvillian phase space). They are also conditioned on a particular partial description of the system (the macroscopic description of the system used to constrain the MaxEnt probability assignment). The probabilities are objective to the extent that given these inputs, a uniquely defined probability distribution will result." [bold type added by the present talk-page writer]
I think this is not quite right. It is true that critics of the MaxEnt viewpoint regard it as dealing with 'subjective (personal) probabilities', but the holders of the MaxEnt viewpoint do not. The holders of the MaxEnt viewpoint hold that the probabilities are simply objective and epistemic. They hold that being epistemic logically precludes them from being subjective or personal. The root of the word epistemic, used by Aristotle, is usually translated as "scientific knowledge", in direct and explicit contrast with "opinion", which is regarded as essentially subjective and personal. In view of this tradition of thinking and language, it is self-contradictory to say that something is both epistemic and subjective/personal. I think all will agree that in science, to label something as 'subjective/personal' is pejorative. That the probabilities are conditioned on a particular model does not make them subjective or personal unless one hides or conceals or fails to state the model. The MaxEnt people are keen to put the model up front and explicit, not to hide it. Indeed, "the probabilities are objective because, given these inputs, as is the case in a properly stated proposition, a uniquely defined probability distribution will result." It is only according to the opponents of the MaxEnt viewpoint that it deals with "subjective/personal" probabilities. The sentence in the article that says otherwise is not an accurate statement, and should be deleted. Perhaps it could be replaced, if one were keen, by a sentence like "According to the opponents of the MaxEnt viewpoint, it deals with subjective or personal probabilities, but the holders of the MaxEnt viewpoint categorically deny this pejorative mischaracterization."
The article at present continues: "At a trivial level, the probabilities cannot be entirely objective, because in reality there is only one system, and (assuming determinism) a single unknown trajectory it will evolve through. The probabilities therefore represent a lack of information in the analyst's macroscopic description of the system, not a property of the underlying reality itself." This is not an argument that the probabilities are not objective. It is an objective fact that the analyst is stating that he has a lack of information. Objectivity is not about direct contact with the underlying reality itself: it is about how the situation is conceived and described; objectivity is an epistemic notion, not an ontological one. The two quoted sentences are inaccurate and misleading, and should be deleted. Chjoaygame (talk) 05:54, 17 October 2009 (UTC)
note of intention to edit
I have proposed above to make some edits to the effect that 'subjective' is not the right word to label the epistemic meaning of probability. The word 'subjective' is used by opponents of MaxEnt theory as a politely pejorative epithet, but that something is epistemic does not make it subjective. MaxEnt people do not think that probability is subjective. So far there have been no replies to my above remarks, which are preliminary to my efforts at editing to this effect. Anyone with a comment? Chjoaygame (talk) 07:00, 23 October 2009 (UTC)