I’ve been thinking about entropy a lot these days, not just in the usual sense of physical systems with atoms and such, but in the sense of “relating log probabilities to description length, coming up with a coding scheme that generates average-case short descriptions, then measuring the length of the description for the system and calling it entropy”. So I might just run wild with it.
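To make the log-probability/description-length link concrete, here’s a minimal sketch (names are my own, not from any particular source): Shannon entropy, -Σ p·log₂(p), is exactly the average number of bits per sample that the best possible code can achieve, since an optimal code gives each outcome a description of roughly -log₂(p) bits.

```python
import math

def entropy_bits(probs):
    """Shannon entropy in bits: the average description length
    per sample under an optimal (shortest-average) code."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally unpredictable: 1 bit per flip.
print(entropy_bits([0.5, 0.5]))   # 1.0
# A biased coin is more compressible, so its entropy is lower.
print(entropy_bits([0.9, 0.1]))   # ~0.469
```

The likely outcome gets a short codeword and the unlikely one a long codeword, so the *average* description length drops below one bit per flip.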
Lies and manipulation in large organizations. This tends to an [immoral maze] / [high simulacrum level] equilibrium where people don’t talk about object level things and mostly talk about social (un)realities. This is related to entropy and shortest length descriptions because there are more ways to talk about social realities than there are ways to talk about object level truths.
Physical entropy. This equilibrates at a maximum that’s related to how big / complex the system is (how long a string it would take to describe the least likely state in the system, when you use a coding scheme that tends to produce shortest-length descriptions). Similar to the previous case, this has something to do with how there are many, many states with low likelihoods and long descriptions.
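One small check on the “maximum related to system size” intuition (my framing, hedged): over n states, entropy is maximized by the uniform distribution, where it equals log₂(n), so the cap grows with the size of the state space.

```python
import math

def entropy_bits(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

n = 8
uniform = [1 / n] * n
# The uniform distribution over n states hits the maximum, log2(n) bits:
print(entropy_bits(uniform), math.log2(n))  # 3.0 3.0
```

Any non-uniform distribution over the same 8 states comes out strictly below 3 bits, which is one way to see why equilibration “toward the most spread-out distribution” and “toward maximum entropy” are the same statement.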
Life (as in, cells that are lit (“alive”) and move in complicated and interesting ways in Conway’s Game of Life). These tend to get locked into repeating patterns or die out. I don’t have a good intuitive explanation for this, just that there are a lot of ways things could die, and not many ways things could come alive.
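You can watch this equilibration happen directly. Below is a throwaway sketch (my own code, small toroidal grid, random soup): almost any random start collapses, within a few hundred steps, into a state the system has visited before, i.e. a repeating cycle (an empty board counts as a cycle of length 1).

```python
import random

def step(live, n):
    """One Game of Life step on an n-by-n toroidal grid.
    `live` is a frozenset of (x, y) coordinates of live cells."""
    counts = {}
    for (x, y) in live:
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                if dx or dy:
                    c = ((x + dx) % n, (y + dy) % n)
                    counts[c] = counts.get(c, 0) + 1
    # A cell is alive next step iff it has 3 neighbours,
    # or has 2 neighbours and is already alive.
    return frozenset(c for c, k in counts.items()
                     if k == 3 or (k == 2 and c in live))

random.seed(0)
n = 12
live = frozenset((x, y) for x in range(n) for y in range(n)
                 if random.random() < 0.3)

seen = {}  # state -> first step it was seen at
for t in range(5000):
    if live in seen:
        print(f"locked into a cycle of length {t - seen[live]} after {t} steps")
        break
    seen[live] = t
    live = step(live, n)
else:
    print("no cycle found within 5000 steps")
```

The cycle detection is just “store every state seen so far”; on a 12×12 torus the reachable states collapse fast enough that this is cheap. The asymmetry you describe is visible in the rules themselves: a dead cell is born only at exactly 3 neighbours, while there are many neighbour counts (0, 1, 4, 5, 6, 7, 8) that kill a live cell.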
I would love it if somebody could critique my examples and help me get a deeper understanding of entropies and equilibria. I have a vague intuition that, in order to count states and assign probabilities, you really really need to look at how state transitions work, and that entropy is somehow related to some sort of “phase space volume” that isn’t necessarily conserved depending on how you’re looking at a system. I feel like there’s probably a LessWrong post I haven’t seen somewhere that would fill in my gap here.
If there isn’t, I would love to get some encouragement and write one.