There are certain words that can embellish any speech or quotation. And, of course, we all know what entropy means: disorder. If we don't get our house in order, we are told, entropy will eat us alive. Except that, in reality, it won't: physicists are constantly explaining that no, entropy does not mean disorder. And yet, with this approximate meaning, it has almost become part of the lexicon of everyday life. We find it in the unintelligible phrases of some famous spiritual guru, and even in self-help advice and motivational coaching. But what does entropy actually mean?

When the Prussian physicist Rudolf Clausius defined entropy in 1865, the idea of disorder was nowhere to be found. Clausius was seeking to explain mathematically the workings of energy in the Carnot heat engine, an optimised model of the heat engine (on which the original diesel engine was based) proposed four decades earlier by the French engineer Sadi Carnot. To express the unusable heat lost, he defined entropy (etymologically, a transformation of energy content), which measures how spontaneously a hot body gives up heat to a cold body as the system tends to equilibrium, unless interfered with to prevent it. The Austrian Ludwig Boltzmann later introduced the current formulation of entropy, giving it a statistical meaning, understood in terms of the probability distribution over the different possible microstates.

The thermodynamic definition of entropy can be found on Wikipedia: it is the integral of the reversible heat flow divided by the temperature at which the flow occurs. This has, at face value, nothing to do with order and disorder, because there is no obvious way to even define structure in thermodynamics. One has to understand the link between thermodynamics and statistical mechanics to get a sense of where the second meaning of entropy comes into play. This link is given by the unproven, and most likely unprovable, ergodic hypothesis. While the ergodic hypothesis is quite plausible and physically very well realized for many systems, one can construct trivial cases for which it does not hold (especially for interaction-free systems like the ideal gas), which should give us some pause with regard to the mathematical status of ergodicity. If we accept the ergodic hypothesis as essentially correct (in a physical, not a strict mathematical, sense), then we can recover thermodynamic entropy as a measure of order and disorder, definable in terms of counting microstates.

However, while in physics entropy is extremely well defined in both thermodynamic and statistical-mechanical terms, many characterizations in the layman literature seem questionable at best, and there is quite a bit of overreach about the "meaning" and function of entropy, IMHO. Part of the problem, as far as I can tell, is that much of the discussion about the dynamics of physical (and biological!) systems is still being carried out at the level of the 19th century, when the world seemed divided into (and fully explainable by!) either mechanics or thermodynamics. Physicists knew already at the end of the 19th century that this is not true, and mathematically the "trouble" with dynamical systems was already known to Newton, who failed to solve the three-body problem. What Newton couldn't know, and what is not at all captured by terms like entropy, is that there is a world of utterly complex dynamics between integrable mechanical systems and the thermodynamic limit. We haven't even begun to scratch the mathematical surface of these things.

Thus my advice to you: when someone talks about entropy, and it is not a chemical process engineer discussing the thermodynamic efficiency of the industrial process he is responsible for, be skeptical.
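For reference, the two definitions discussed above can be written compactly in standard textbook notation (my notation, not the original article's): Clausius's entropy change along a reversible path, and Boltzmann's entropy of a macrostate realized by W microstates.

```latex
% Clausius (1865): entropy change along a reversible path
\Delta S = \int_{1}^{2} \frac{\delta Q_{\mathrm{rev}}}{T}

% Boltzmann: entropy of a macrostate with multiplicity W
S = k_{\mathrm{B}} \ln W
```

The first formula never mentions microstates at all, which is exactly why, at face value, it has nothing to do with order or disorder.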
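Boltzmann's "counting microstates" picture is easy to demonstrate with a toy model (my example, not from the post): N two-state spins, where a macrostate is the number of up-spins and its statistical entropy is the logarithm of its multiplicity.

```python
from math import comb, log

# Toy model: N two-state particles (spins up or down). A "macrostate" is
# the number k of up-spins; the number of microstates realizing it is C(N, k).
N = 100
multiplicities = [comb(N, k) for k in range(N + 1)]
total = 2 ** N  # total number of microstates

# Boltzmann entropy (in units of k_B) of a macrostate: S = ln W
S_ordered = log(multiplicities[0])       # all spins down: W = 1
S_mixed = log(multiplicities[N // 2])    # half up, half down: W is huge

print(S_ordered)                     # 0.0 -- a single microstate, zero entropy
print(S_mixed)                       # ≈ 66.8 -- overwhelmingly many microstates
print(multiplicities[N // 2] / total)  # ≈ 0.08 -- the single most likely macrostate
```

Nothing here invokes heat or temperature; "disorder" enters only through how many microstates are compatible with what we can observe macroscopically.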
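The link between the thermodynamic and statistical definitions can also be checked numerically in the one case where both routes are elementary: the isothermal expansion of an ideal gas (a standard textbook exercise; the constants and function names below are mine, chosen for illustration).

```python
import math

# Physical constants (CODATA values, rounded)
R = 8.314462618      # molar gas constant, J/(mol K)
N_A = 6.02214076e23  # Avogadro constant, 1/mol
k_B = R / N_A        # Boltzmann constant, J/K

def dS_thermodynamic(n_mol, V1, V2):
    """Clausius route: dS = integral of dQ_rev / T.
    For an isothermal ideal-gas expansion, Q_rev = n R T ln(V2/V1),
    so the integral collapses to dS = n R ln(V2/V1)."""
    return n_mol * R * math.log(V2 / V1)

def dS_statistical(n_mol, V1, V2):
    """Boltzmann route: S = k_B ln W.
    For N non-interacting particles, W scales like V**N,
    so dS = k_B ln(W2/W1) = N k_B ln(V2/V1)."""
    N = n_mol * N_A
    return N * k_B * math.log(V2 / V1)

n, V1, V2 = 1.0, 1.0, 2.0  # one mole, volume doubles
print(dS_thermodynamic(n, V1, V2))  # ≈ 5.76 J/K
print(dS_statistical(n, V1, V2))    # ≈ 5.76 J/K, the same number
```

That the two numbers agree is precisely the bridge the ergodic hypothesis is meant to underwrite in general; here it works because the ideal gas is simple enough to treat both ways by hand.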