But does this take into account the fact that one large fluctuation can give rise to trillions of brains? Enough so that it would be more likely that an observer would be in one of these larger ones?
Yeah, it does. The probabilities involved here are ridiculously unbalanced. The frequency of a fluctuation (assuming ergodicity) depends exponentially on its entropy, so even modest differences in entropy correspond to enormous differences in probability, and the entropy difference here is itself huge. For comparison, it’s been estimated that a fluctuation into our current macroscopic universe would be likelier than a fluctuation into the macroscopic state of the very early universe by a factor of about 10^(10^101). Next to a suppression factor like that, a multiplicity of trillions of brains (a factor of only ~10^12) is utterly negligible.
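To see why the trillionfold multiplicity doesn't matter, it helps to work in log space, since the numbers involved overflow any floating-point representation. This is a minimal sketch assuming the standard Boltzmann/ergodic picture, where a fluctuation's relative frequency scales as exp(ΔS/k_B); the specific entropy-gap value is a hypothetical chosen to match the cited 10^(10^101) estimate, not a derived quantity.

```python
import math

def log10_frequency_ratio(delta_S_over_kB):
    """log10 of the relative frequency of two fluctuations whose
    entropies differ by delta_S_over_kB (in units of Boltzmann's constant).
    Frequency ~ exp(S/k_B), so the ratio is exp(dS/k_B)."""
    return delta_S_over_kB / math.log(10)

# Hypothetical entropy gap sized so the frequency ratio is 10^(10^101),
# matching the cited estimate for current universe vs. early universe.
gap = 1e101 * math.log(10)
print(log10_frequency_ratio(gap))   # ~1e101, i.e. a ratio of 10^(10^101)

# A large fluctuation hosting trillions of brains only buys a factor of
# ~10^12 more observers -- the exponent barely moves:
print(log10_frequency_ratio(gap) - 12)  # still ~1e101
```

The point of the log-space framing: adding 12 to an exponent of 10^101 changes nothing, so observer-counting within a single large fluctuation cannot compensate for the entropy cost of producing it.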
Under this idea, wouldn’t the Boltzmann brain have an equivalent belief about “Barack Obama”, which would correspond to some isomorphic thing in its environment? And then, wouldn’t this be extremely unlikely, since by definition, the Boltzmann brain is in a higher entropy place (as it would observe a world isomorphic to ours, which has relatively low entropy)?
Not sure what you’re getting at here. The belief state in the Boltzmann brain wouldn’t be caused by some external stable macroscopic object. It’s produced by the chance agglomeration of microscopic collisions (in Boltzmann’s model).
I get why many of my other comments on this post (and the post itself) have been downvoted, but I can’t figure out why the parent of this comment has been downvoted. Everything in it is fairly uncontroversial science, as far as I know. Does someone disagree with the claims I make in that comment? If so, I’d like to know! The possibility that I might be saying false things about the science bothers me.
The belief state in the Boltzmann brain wouldn’t be caused by some external stable macroscopic object.
I don’t think it matters what caused the belief. Just that if it had the same state as your brain, that state would correspond to a brain that observed a place with low entropy.
I’m still having trouble understanding your point. I think there is good reason to think the Boltzmann brain does not in fact have any beliefs. Beliefs, as we understand them, are produced by certain sorts of interactions between brains and environments. The Boltzmann brain’s brain states are not attributable to interactions of that sort, so they are not beliefs. Does that help, or am I totally failing to get what you’re saying?
Oh okay then.
But wouldn’t a Boltzmann brain understand its “beliefs” the same way, despite them not corresponding to reality?