>Suppose that I have a coin with probability of heads p. I certainly know that p is fixed and does not change as I toss the coin. I would like to express my degree of belief in p and then update it as I toss the coin.
It doesn’t change, because as you said, you “certainly know” that p is fixed and you know the value of p.
So if you would like to express your degree of belief in p, it’s just p.
>But let’s say I’m a super-skeptic guy that avoids accepting any statement with certainty, and I am aware of the issue of parametrization dependence too.
In that case, use Bayes' Theorem to update your beliefs about p. Presumably there will be no change, but there is always at least a tiny chance that you were wrong and your prior needs to be updated.
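As a concrete sketch of what that updating looks like: for a coin with unknown bias p, a Beta prior is conjugate to the coin-flip likelihood, so the Bayesian update reduces to adding counts. The function name and the specific prior parameters below are my own illustrative choices, not anything from the discussion above.

```python
# Illustrative sketch: Bayesian updating of belief about a coin's bias p,
# using a Beta(a, b) prior (conjugate to the Bernoulli likelihood).

def update_beta(a, b, flips):
    """Update a Beta(a, b) prior with observed flips (1 = heads, 0 = tails).

    With a conjugate Beta prior, the posterior is simply
    Beta(a + heads, b + tails).
    """
    heads = sum(flips)
    tails = len(flips) - heads
    return a + heads, b + tails

# Start from a uniform prior over p, i.e. Beta(1, 1),
# then observe 4 heads and 2 tails.
a, b = update_beta(1, 1, [1, 0, 1, 1, 0, 1])
posterior_mean = a / (a + b)  # (1 + 4) / (2 + 6) = 0.625
```

The posterior mean drifts from the prior's 0.5 toward the observed frequency of heads as evidence accumulates.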
Knowing (or assuming) that the value of p does not change between experiments is a different kind of knowledge than knowing the value of p.
OK. But if you yourself state that you “certainly know”—certainly—that p is fixed, then you have already accounted for that particular item of knowledge.
If you do not, in fact, "certainly know" the value of p (as could easily be the case if you picked up a coin in a mafia-run casino or whatever), then your prior for p might reasonably be centered at 0.5, but you should also be prepared to update it according to Bayes' Theorem.
I see that you are gesturing towards also assigning a probability that the coin is fair (or, more generally, that it has a particular value of p). That is amenable to Bayes' Theorem in the normal way. Your prior might be based on how common biased coins are amongst the general population of coins, or on a rough guess of how many you would expect to find in a mafia-run casino. In any case, your prior will become increasingly irrelevant the more times you flip the coin, so I don't think you need to be too concerned about how nebulous that prior and its origins are!
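The claim that the prior washes out can be checked numerically. In this sketch (the true bias, the number of flips, and the two prior choices are all my own assumptions for illustration), an observer who starts nearly certain the coin is fair and one who starts with a uniform prior end up with almost identical posterior means after enough flips:

```python
import random

random.seed(0)
true_p = 0.7  # hypothetical true bias, unknown to the observers
flips = [1 if random.random() < true_p else 0 for _ in range(2000)]
heads = sum(flips)
tails = len(flips) - heads

# Two very different Beta priors over p:
# one strongly concentrated at "fair", one completely uniform.
priors = {"confident the coin is fair": (100, 100), "uniform": (1, 1)}
for name, (a, b) in priors.items():
    posterior_mean = (a + heads) / (a + b + heads + tails)
    print(f"{name}: posterior mean = {posterior_mean:.3f}")
```

After 2000 flips, both posterior means sit close to the empirical frequency of heads; the 200 pseudo-counts of the "confident" prior are swamped by the 2000 real observations.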