I don’t think you need to claim that there are different kinds of uncertainty to solve these. If you clearly specify what predicted experiences/outcomes you’re applying the probability to, both of these examples dissolve.
“Will you remember an awakening” has a different answer than “how many awakenings will be reported to you by an observer”. The uncertainty about both is of the same kind: ignorance.
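To make this concrete, here is a minimal Monte Carlo sketch of the standard Sleeping Beauty setup (heads: one awakening, tails: two); the script and its names are illustrative rather than anything from the post. The same fair coin yields two different frequencies depending on whether you count per experiment or per awakening, so once you specify which frequency the probability is supposed to predict, there is nothing left to disagree about.

```python
import random

def simulate(n_experiments: int = 100_000, seed: int = 0) -> None:
    """Monte Carlo sketch of the standard Sleeping Beauty setup:
    heads -> one awakening, tails -> two awakenings."""
    rng = random.Random(seed)

    heads_experiments = 0   # experiments in which the coin landed heads
    total_awakenings = 0    # awakenings across all experiments
    heads_awakenings = 0    # awakenings that occur under heads

    for _ in range(n_experiments):
        heads = rng.random() < 0.5
        awakenings = 1 if heads else 2

        heads_experiments += heads
        total_awakenings += awakenings
        heads_awakenings += awakenings if heads else 0

    # Question 1: per experiment, how often is the coin heads?
    print("P(heads) per experiment:", heads_experiments / n_experiments)    # ~0.5
    # Question 2: per awakening, how often is the coin heads?
    print("P(heads) per awakening: ", heads_awakenings / total_awakenings)  # ~0.33

if __name__ == "__main__":
    simulate()
```

Running this prints roughly 0.5 and 0.33, the familiar halfer and thirder answers, which here are simply answers to two different, clearly specified questions.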
This implies that everyone arguing about the correct probability in Sleeping Beauty is misguided, right?
I definitely think it is essential to differentiate between the two. I think there are cases where the question is the same and meaningful, but the answer changes as the nature of the uncertainty changes; the Presumptuous Philosopher is such a case.
I argue further in the next post that the results of this model are meaningful.
Yes, everyone arguing that there is a correct probability without defining what that probability is predicting is misguided.