I think this is the OP’s point—there is no (human) mind capable of caring, because human brains aren’t capable of modelling numbers that large properly. If you can’t contain a mind, you can’t use your usual “imaginary person” modules to shift your brain into that “gear”.
So, until you find a better way, you have to sort of act as if your brain were screaming that loudly, even when your brain doesn’t have a voice that loud.
Why should I act this way?
To better approximate a perfectly rational Bayesian reasoner (one that shares your values).
Which, presumably, would be able to model the universe correctly complete with large numbers.
That’s the theory, anyway. Y’know, the same way you’d switch in a Monty Hall problem even if you don’t understand it intuitively.
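For anyone who hasn’t run the numbers on that: a quick simulation (my own sketch, not part of the original comment) shows why switching is the right move even when intuition objects.

```python
import random

def monty_hall(trials=100_000, switch=True):
    """Simulate Monty Hall: a car behind one of three doors, the host
    always opens a goat door you didn't pick, then you stay or switch."""
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)
        pick = random.randrange(3)
        # Host opens a door that is neither your pick nor the car.
        opened = next(d for d in range(3) if d != pick and d != car)
        if switch:
            # Switch to the one remaining closed door.
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials

print(monty_hall(switch=True))   # ~0.667: switching wins about 2/3 of the time
print(monty_hall(switch=False))  # ~0.333: staying wins about 1/3 of the time
```

The point being: you act on the 2/3 figure because the math says so, whether or not your gut has caught up.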