This is an archive of past discussions about Introduction to entropy. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page.
There is at least one objection above to leading off with the statistical-mechanical description. There is also at least one insistence that motion be mentioned in the first paragraph. If this article is to be about entropy in general — including the popular concept and information entropy — then it's inappropriate to lead off with a purely thermodynamic account. The statistical-mechanical account applies to both thermodynamics and information, and it explains how the physical concept led to the popular concept. That it "explains but does not define" thermodynamic entropy is a subtlety that seriously gums up the mission of getting entropy across to the English major or the financial analyst. I urge certain editors, for the nth time, to get over it in deference to that mission. If they absolutely cannot, perhaps the distinction can be addressed in the body of the article. -Jordgette [talk] 15:22, 5 December 2020 (UTC)
Sounds like the discussion is getting bogged down in generalities again. Chjoaygame continues his "wall of words" replies in which he appeals to "authorities" such as Edwin Thompson Jaynes, Edward A. Guggenheim, Peter Atkins, and college textbook authors. WP:TLDR territory. I don't see these as too relevant to this article, at least not the introductory sections. These scientists were writing for other scientists, physicists and chemists, or students of physics and chemistry at the university level. We are writing for readers at a lower level, who have only taken high school or middle school science or maybe no science at all. That means tailoring explanations to their level of understanding, as I think Jordgette and DrPippy are trying to do. We don't need to be limited to one approach, or form of words. Chjoaygame, I think if you want to appeal to textbooks for wording, the appropriate ones are high school or middle school textbooks.
I would say a more productive use of time would be to start editing the article per WP:BRD. One or more editors could write a new introduction. Then arguments could be about specific wording, or about whether that approach should be abandoned. --ChetvornoTALK 18:41, 6 December 2020 (UTC)
Withdrawn proposals & ensuing discussion
Hybrid version of 1st paragraph
Thank you, Chetvorno. I'm uncomfortable with doing a BRD on the intro, as we still haven't settled on a solid direction for the first paragraph. I shall now attempt a hybrid version of my and DrPippy's proposed first paragraphs, taking into consideration points raised by other editors:
One will note: "Understood" rather than "defined." "Spread" rather than "evenly distributed." "That we recognize as 'disordered'" rather than "disordered." -Jordgette [talk] 20:38, 6 December 2020 (UTC)
Proposed wording of lead, version 1

Entropy is a concept in physics, specifically the field of thermodynamics, that has entered popular usage and is also employed in information theory. Entropy is a numerical quantity that measures the number of different ways that the constituent parts of a system can be arranged to produce the same overall state. In popular usage, entropy is often considered to be a measurement of disorder, or to refer to a lack of order or predictability, or to a gradual decline into disorder.[citation] The equivalence between entropy and disorder arises because states that we recognize as "disordered" almost always have higher entropy than "ordered" states. For example, there are relatively few ways to organize a deck of cards so that it is separated by suit, compared to the number of arrangements where the suits are mixed together. Similarly, there are relatively few ways that particles in a concentrated puff of smoke can be arranged, compared to the number of arrangements after the smoke has spread throughout a room. A shuffled deck and spread-out smoke both have higher entropy than their well-ordered counterparts.

Consider the animation of blue and red balls at right [in progress], which can be viewed as a schematic representation of molecules of a gas at two different temperatures, or molecules of two different gases. The balls start out separated, an arrangement we might describe as an "ordered state", with a particular value for the system's entropy. If each ball is allowed to move at random and displace other balls (as happens among gas molecules possessing heat energy), the balls do not stay separated for long. They spontaneously begin to blend, as balls of each color spread among the balls of the other color. In doing so, the entropy of the system rises: the number of arrangements of individual balls that would produce any given overall arrangement goes up considerably. At some point, however, the spread of each color into the other reaches a maximum, and we can no longer discern a "red side" or a "blue side"; any further movement does not appreciably change the situation. In thermodynamics, this point — at which entropy reaches a maximum — is called equilibrium.

The second law of thermodynamics is one of the foundational principles of physics; it states that the entropy of a closed system (i.e., one with no outside influences) tends to increase over time until equilibrium is reached. For example, it is extremely improbable for the entropy of the randomly moving colored balls to decrease, i.e., for the balls to spontaneously regroup into a "red side" and a "blue side." Likewise, it is extremely improbable for particles of smoke spread throughout a room to re-form into a concentrated puff, or for a shuffled deck of cards, upon further shuffling, to spontaneously become reordered by suit.

The second law implies that many physical processes are irreversible. You can pour cream into coffee and mix it, but you cannot "unmix" it; you can burn a piece of wood, but you can't "unburn" it. If you saw a movie of smoke going back into a smokestack, or of mixed coffee separating into black coffee and cream, you would know that the movie had been reversed. In some cases, however, the entropy of a changing system increases very little. When two billiard balls collide, the change in entropy is very small (a bit of their kinetic energy is lost to the environment as heat), so a reversed movie of their collision might appear normal.
The question of why entropy increases until equilibrium is reached was answered in 1854 by Ludwig Boltzmann. The theory developed by Boltzmann and others, known as statistical mechanics, explains thermodynamics in terms of the statistical behavior of the atoms and molecules that make up the system. Later, Claude Shannon applied the concept of entropy to information, such that the entropy of a message transmitted along a wire can be calculated.

Proposed sections of body:
• History of entropy in thermodynamics
• Entropy in information theory
• Entropy in popular culture

-Jordgette [talk] 17:40, 7 December 2020 (UTC)
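(Aside for other editors: here is a minimal sketch, in Python, of the arrangement-counting idea used in the proposal above, the red/blue-ball example. The 20-ball size and the convention that "ordered" means all red balls on one side are my own hypothetical choices for illustration; they are not part of the proposed article text.)

```python
# Rough illustration of "number of arrangements": 10 red and 10 blue
# indistinguishable balls placed on 20 sites (hypothetical toy system).
from math import comb

N = 20          # total sites: 10 red balls + 10 blue balls
half = N // 2

ordered = 1                        # one arrangement keeps every red ball on the "red side"
all_arrangements = comb(N, half)   # ways to choose which sites hold the red balls

print(f"'ordered' arrangements: {ordered}")
print(f"all arrangements:       {all_arrangements}")   # 184,756 for N = 20
# Even with only 20 balls, mixed-looking arrangements outnumber the single
# ordered one by roughly 185,000 to 1; for the ~10^23 molecules in a real gas
# the ratio is astronomically larger, which is why the spread-out state is
# the one we actually observe.
```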
Would another editor kindly parse the above and opine on which elements, if any, are relevant to this article and its intended audience? Clearly the Boltzmann date is wrong (should be 1872). Much of this text is copied from earlier proposals. Thank you. -Jordgette [talk] 15:45, 8 December 2020 (UTC)
Hi folks, sorry for the delay. Give me another couple of days to finish up the animation and put a version together incorporating the feedback. I'll plug it into the article for further feedback and refining, and then we'll move on to the body. -Jordgette [talk] 03:45, 11 December 2020 (UTC)
I'm fundamentally at odds with PAR's position on this. The laws of thermodynamics emerge from statistical mechanics, not the other way around. The reason that entropy increases is fundamentally due to the statistical properties of large systems of particles; if thermodynamics didn't have the Second Law, statistical mechanics would demand it. If I'm understanding PAR's point about the relationship between thermo and stat mech (e.g., "If any statistical mechanical theory is ever at odds with thermodynamics, then it is simply wrong, and not vice versa."), it simply amounts to an argument that theories have to conform to observations. This is not in dispute, but doesn't really provide any useful guidance for why the thermodynamic view of entropy should take precedence over the statistical mechanical view. The fact of the matter is that stat mech entropy explains thermo entropy; the reverse is not true. Nor does thermo entropy provide a particularly clear explanation of why the second law implies irreversibility, tendency to disorder, etc.; the stat mech version does.
Thermodynamics says that there's some quantity defined by dS=dQ/T, and that dS>=0 for any spontaneous process in an isolated system, etc. This is sort of like observing that planetary orbits are elliptical. This observation is explained by Newton's law of gravity, which also explains a number of other phenomena. Arguing that we should start introducing the concept of entropy with the narrower thermodynamic definition strikes me as similar to arguing that we should start an introduction to gravity by talking about the shape of planetary orbits rather than with Newton's law.
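(For concreteness, purely my own numerical illustration and not something DrPippy wrote: this is what the dS = dQ/T bookkeeping in the paragraph above looks like for heat leaking from a hot reservoir to a cold one. The values of Q and the two temperatures are arbitrary hypothetical choices.)

```python
# Entropy change when heat Q flows from a hot reservoir to a cold one,
# using the Clausius relation dS = dQ/T for each reservoir.
Q = 100.0        # joules of heat transferred (hypothetical value)
T_hot = 400.0    # kelvin (hypothetical)
T_cold = 300.0   # kelvin (hypothetical)

dS_hot = -Q / T_hot     # the hot reservoir loses entropy
dS_cold = +Q / T_cold   # the cold reservoir gains more entropy than the hot one lost

dS_total = dS_hot + dS_cold
print(f"dS_hot = {dS_hot:+.3f} J/K, dS_cold = {dS_cold:+.3f} J/K")
print(f"dS_total = {dS_total:+.3f} J/K  (> 0, as dS >= 0 requires)")
```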
I also think that the stat mech version is potentially easier to understand. I'm not sure that there's a way to frame entropy as a measurement of heat that's unavailable to do work (or whatever version of this you prefer) that isn't going to feel pretty abstract to the uninitiated; at least, I haven't heard one that really works for me. If we don't mind sacrificing some degree of technical rigor (and I think that's okay in this context), we could simply say that entropy is a measurement of the probability that a system will find itself in some particular state/arrangement/configuration, and thus the second law is simply a result of a system in a relatively improbable state moving to increasingly probable states as the deck is slowly shuffled, so to speak. In fact, I think I would prefer something along these lines even to the "number of arrangements" wording that Jordgette and I have been working with. (This definition is technically wrong, but it's wrong in sort of the same way that Newtonian gravity is wrong compared to GR, and you definitely wouldn't want to start off a layman's introduction to gravity by talking about geodesics in curved spacetime and all that.)
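(And a minimal sketch of the probability framing just described, again my own illustration with a hypothetical 20-ball system and my own choice of what counts as "separated" and "well-mixed": repeated shuffling almost never lands in the separated macrostate.)

```python
# Shuffle a half-red, half-blue row of balls many times and record how often
# it ends up in the "separated" macrostate versus a roughly even mix.
import random

N_TRIALS = 200_000
balls = ['R'] * 10 + ['B'] * 10

separated = mixed = 0
for _ in range(N_TRIALS):
    random.shuffle(balls)
    left_reds = balls[:10].count('R')
    if left_reds in (0, 10):      # all reds on one side: the "ordered" macrostate
        separated += 1
    elif left_reds in (4, 5, 6):  # roughly even split: a "well-mixed" macrostate
        mixed += 1

print(f"separated : {separated / N_TRIALS:.6f}")   # about 2/184756, i.e. ~0.00001
print(f"well-mixed: {mixed / N_TRIALS:.6f}")       # the large majority of shuffles
# The separated state is never forbidden, just wildly improbable, which is the
# probabilistic reading of the second law.
```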
TL;DR version: stat mech provides an explanation of the concept of (physical) entropy that is more explanatory, more intuitive, and more fundamental than the one thermodynamics provides, so we should lead with that. I would prefer to explain entropy in terms of probability instead of multiplicity, but I feel less strongly about that.
We seem to be at a bit of an impasse: I think Jordgette and I are more or less on the same page, and possibly Bduke as well; PAR and Chjoaygame are not on that page, but their objections seem to come from different directions. Is it time to pursue some sort of dispute resolution process (RfC, etc.)? DrPippy (talk) 14:26, 12 December 2020 (UTC)
I thought it might be useful to take stock of where we are in terms of points of agreement or disagreement. So here's what I see as three major questions where we need to achieve consensus, together with what I've seen as the likely answers. (I trust y'all will add anything that I've missed here!) Not trying to break any new ground here; just hoping to provide an organizational framework where it might be easier to keep track of the discussion (which I'm having a bit of trouble following, tbh). Hopefully this is something we can !vote on, and it'll make it clearer where things stand. DrPippy (talk) 14:58, 12 December 2020 (UTC)
Option 1: No math at all
Option 2: No math in the introduction, some basic math in the body of the article
Option 3: Basic math throughout, including introduction
Option 4: The article should incorporate an advanced mathematical treatment of entropy; introduction might or might not have some math, but we should end up in the deep end.
Option 1: Thermodynamic
Option 2: Stat mech
Option 3: Informational
Option 4: Popular (disorder, etc.)
Other?: I'm sure I've left some out...
Option 1: Thermodynamic
Option 2: Stat mech
Option 3: Informational
Option 4: Popular (disorder, etc.)
Option 5: Other/none of the above/all of the above