I think this epistemic uncertainty is distinct from the type of “objective probabilities” I talk about in my post, and I don’t really know how to talk about it without referring to degrees of my epistemic uncertainty.
The part I was gesturing at wasn’t the “probably” but the “low measure” part.
Is your position that the problem is deeper than this, and there is no objective prior over worlds, it’s just a thing like ethics that we choose for ourselves, and then later can bargain and trade with other beings who have a different prior of realness?
Yes, that’s a good summary of my position—except that I think that, like with ethics, there will be a bunch of highly-suggestive logical/mathematical facts which make it much more intuitive to choose some priors over others. So the choice of prior will be somewhat arbitrary but not totally arbitrary.
I don’t think this is a fully satisfactory position yet, it hasn’t really dissolved the confusion about why subjective anticipation feels so real, but it feels directionally correct.