I like your example because it mirrors my thinking about the Sleeping Beauty puzzle, and brings it out even more strongly: whether the 1⁄2 or 1⁄3 answer (or 50% or 90% answer) is appropriate depends on which probability one is interested in.
My question: which world contains the more rational people?
That depends on how you define being rational/well-calibrated.
In Halfer Country, when someone says they’re 50% sure of having cancer, they do indeed have a 50% chance of having cancer.
In Thirder Land, any time someone makes the statement ‘I’m 90% sure I have cancer’, the statement has a 90% chance of coming from someone who has cancer.
Some of us were evidently born in Thirder Land, others in Halfer Country; my intuition works halferly, but the problem’s a bit like a Necker cube for me now—if I think hard enough I can press myself into seeing the other view.
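To make the two readings of 'well-calibrated' concrete, here's a minimal simulation sketch. It assumes a setup the thread doesn't fully restate: half the population has cancer, and (mirroring the tails-means-two-awakenings structure) each person with cancer states their credence nine times while everyone else states it once; the nine is a hypothetical choice that makes the thirder figure come out to 90%.

```python
# Minimal sketch under an assumed (hypothetical) setup: half the population has
# cancer, and each person with cancer states their credence nine times while
# each person without states it once.
import random

def calibration(population=100_000, statements_if_cancer=9):
    has_cancer = [random.random() < 0.5 for _ in range(population)]
    # Halfer Country reading: among the people saying "50%", what fraction have cancer?
    per_person = sum(has_cancer) / population
    # Thirder Land reading: among the statements of "90%", what fraction come
    # from someone who has cancer?
    total_statements = sum(statements_if_cancer if c else 1 for c in has_cancer)
    statements_from_cancer = statements_if_cancer * sum(has_cancer)
    per_statement = statements_from_cancer / total_statements
    return per_person, per_statement

print(calibration())  # roughly (0.5, 0.9): each country looks calibrated on its own terms
```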
Your comment and neq1's intuition pump prompted me to create the following reformulation of the problem without amnesia:
I flip a coin hidden from you, then ask you to name a number. If the coin came up heads, I write your answer into my little notebook (which, incidentally, is all you care about). If it came up tails, I write it in the notebook twice.
When the problem is put this way, it’s clear that the answer hinges on how exactly you care about my notebook. Should it matter to us how many times we express our credence in something?
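One way to cash out 'how you care about the notebook' is to grade it with a proper scoring rule. The following is only a sketch under assumptions the reformulation leaves open: the number you name is your credence that the coin landed heads, and each notebook entry is graded with a Brier score.

```python
# Sketch under assumed scoring rules: the named number is your credence that
# the coin landed heads; each notebook entry is graded with a Brier score.
import random

def expected_loss(credence, per_entry, trials=200_000):
    """Average Brier loss for always answering `credence`."""
    total = 0.0
    for _ in range(trials):
        heads = random.random() < 0.5
        loss = (credence - (1.0 if heads else 0.0)) ** 2
        if per_entry and not heads:
            loss *= 2  # tails: the same answer sits in the notebook twice
        total += loss
    return total / trials

for c in (1 / 2, 1 / 3):
    print(f"credence {c:.2f}: graded per entry {expected_loss(c, True):.3f}, "
          f"graded per flip {expected_loss(c, False):.3f}")
# Grading every entry favours 1/3 (~0.333 vs ~0.375); grading once per flip favours 1/2 (~0.250 vs ~0.278).
```

So caring about the notebook entry by entry is the thirder's loss function, and caring about it once per flip is the halfer's.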
You make a good point. However, I'd argue that those in Thirder Land had nothing to update on. In fact, it's clear they didn't, since they all give the same answer. If 50% of the population has cancer, but they all think they do with 0.9 probability, that's not necessarily less accurate than if everyone thinks they have cancer with 0.5 probability (it depends on your loss function or whatever). But the question here is really about whether you had evidence to shift from .5 to .9.