Sean Carroll has a very sophisticated discussion of the physics concept of entropy on his blog, in connection with a new paper by him and several collaborators.
He discusses two complementary but distinct conventional ways to define entropy (one based on macrostates and microstates, the other based on the information known about the system), notes that the Second Law of Thermodynamics holds only statistically, reviews a derivation of the Second Law from two antecedent corollaries of it, and then generalizes the information-based definition of entropy so that increased information can decrease a system's entropy in a Bayesian manner, after adjusting for heat lost from the system.
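To make the two conventional definitions concrete, here is a minimal illustrative sketch (my own, not from Carroll et al.'s paper): the Boltzmann count-of-microstates entropy and the Gibbs/Shannon information-based entropy agree for a uniform distribution, and conditioning on a measurement outcome lowers the information-based entropy. Units of k_B = 1 are assumed.

```python
import math

# 1) Boltzmann entropy: S = ln(Omega), where Omega counts the
#    microstates compatible with a given macrostate (k_B set to 1).
def boltzmann_entropy(num_microstates):
    return math.log(num_microstates)

# 2) Gibbs/Shannon entropy: S = -sum_i p_i ln(p_i), based on the
#    probability distribution an observer assigns over microstates.
def shannon_entropy(probs):
    return -sum(p * math.log(p) for p in probs if p > 0)

# For a uniform distribution over Omega microstates the two agree:
omega = 8
uniform = [1 / omega] * omega
assert abs(shannon_entropy(uniform) - boltzmann_entropy(omega)) < 1e-12

# Learning information lowers the information-based entropy: if a
# measurement rules out half the microstates (Bayesian conditioning
# on the outcome), the observer's entropy drops by exactly ln(2).
posterior = [2 / omega] * (omega // 2)
drop = shannon_entropy(uniform) - shannon_entropy(posterior)
assert abs(drop - math.log(2)) < 1e-12
```

This is only the textbook picture; the paper's contribution, as I read the blog post, is in how the heat-loss adjustment is combined with this Bayesian updating.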
This is really quite a groundbreaking paper conceptually.
A post on entropy is also as good a place as any to note the premature passing of Jacob Bekenstein last week, an astrophysicist best known for his work on the entropy of black holes.
Obviously the relationship between entropy and information is as perennial a topic as, for example, the interpretation of quantum mechanics. Thousands of papers have been written, arguing for the correctness of a particular perspective, introducing new variations on the concepts, and so on.
I would be astounded if Sean's contribution were a significant one, since I have never yet seen him write a paper that I wanted to remember. But he blogs about them all and so they get talked about.
I recently learned of this and while it's intolerably long and obsessive, possibly even crackpot in origin, I did get the sense that there was something significant to be learned from it. So that's my counter-recommendation.