Perhaps I should bring up a point about probabilistic reasoning here. If you believe that a proposition is true with probability 1, then you cannot rationally change your belief away from probability 1: with a prior of exactly 1, Bayes’ theorem returns a posterior of 1 no matter what evidence you condition on. So really, nobody believes any empirical fact with a probability of 1 or 0.
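A quick sketch of why a prior of exactly 1 is immovable, using Bayes’ theorem with exact fractions (the specific likelihood values here are made up purely for illustration):

```python
from fractions import Fraction

def bayes_update(prior, likelihood_if_true, likelihood_if_false):
    """Posterior P(H|E) = P(E|H)P(H) / [P(E|H)P(H) + P(E|~H)P(~H)]."""
    numerator = likelihood_if_true * prior
    evidence = numerator + likelihood_if_false * (1 - prior)
    return numerator / evidence

# An ordinary prior of 9/10: strong counter-evidence moves it a lot.
p = bayes_update(Fraction(9, 10), Fraction(1, 100), Fraction(99, 100))
# p == 1/12 — the belief dropped sharply.

# A prior of exactly 1: the very same counter-evidence moves it nowhere,
# because the P(~H) term is multiplied by zero.
q = bayes_update(Fraction(1), Fraction(1, 100), Fraction(99, 100))
# q == 1 — no evidence can ever shift it.
```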
The last sentence shouldn’t be “nobody” but “no Bayesian rationalist.”
what interpretation of the word “probability” does allow you to think that the probability of something is 1 and then change to something other than 1?
As far as I know, a frequentist could never do this. They’d need an infinitely long sequence of experiments before they could conclude that the probability of an event was 1.
what interpretation of the word “probability” does allow you to think that the probability of something is 1 and then change to something other than 1?
Any interpretation where you can fix a broken model. I can imagine a conversation like this...
Prankster: I’m holding a die behind my back. If I roll it, what probability would you assign to a 1 coming up?
cupholder: Is it loaded?
Prankster: No.
cupholder: Are you throwing it in a funny way, like in one of those machines that throws it so it’s really likely to come up a 6 or something?
Prankster: No, no funny tricks here. Just rolling it normally.
cupholder: Then you’ve got a 1⁄6 probability of rolling a 1.
Prankster: And what about rolling a 2?
cupholder: Well, the same.
Prankster: And so on for all the other numbers, right?
cupholder: Sure.
Prankster: So you assign a probability of 1 to a number between 1 and 6 coming up?
cupholder: Yeah.
Prankster: Surprise! It’s 20-sided!
cupholder: Huh. I’d better change my estimate from 1 to 6⁄20.
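The model repair in that dialogue is just swapping a new outcome space into the same uniform-die formula — the probability assignment was fine, the model it was computed from was broken. A minimal sketch:

```python
from fractions import Fraction

def prob_in_range(sides, favorable):
    """P(outcome lands in the favorable set) for a fair die with `sides` faces."""
    return Fraction(favorable, sides)

# Believing the die is six-sided: P(result in 1..6) = 6/6 = 1.
before = prob_in_range(6, 6)

# After learning it's actually 20-sided: P(result in 1..6) = 6/20 = 3/10.
after = prob_in_range(20, 6)
```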
what interpretation of the word “probability” does allow you to think that the probability of something is 1 and then change to something other than 1?
They need to have inconsistent attitudes about how they calculate probability, or estimate probabilities by inherently irrational means, such as assigning likelihood based on which hypothesis they most want to be true and then acting as if that belief were certain. Empirically, I’ve met individuals who claim that no amount of evidence would alter some of their beliefs, so something like this may be going on. It is, however, possible that trying to model these beliefs as probabilities imputes a degree of rationality that they simply lack. The human mind is not generally a good Bayesian.
It may be a matter of language use—if I assign something a probability of 1, it means that everything I know now points in that direction, but I leave open the possibility that I might come to know more.
I think my underlying premise is that “no evidence could ever convince me otherwise” is so ridiculous a stance that it doesn’t need to be included in the schema.