I’m making a separate reply for the betting thing, only to try to keep the two conversations clean/simple.
Let’s muddle through it:
If I have a box containing an unknown (to you) number of gumballs and I claim that there are an odd number of gumballs, you would actually be quite reasonable in assigning a 50% chance to my claim being true.
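To see why the parity case is special: under almost any prior over the count that doesn’t favor odd over even, “the number is odd” comes out true about half the time. A throwaway simulation, where the uniform range over 1..1000 is an arbitrary illustrative choice:

```python
import random

# Toy check: if your prior over gumball counts is anywhere near
# symmetric between odd and even (here: uniform over 1..1000, an
# arbitrary illustrative choice), "the count is odd" comes out
# true about half the time.
trials = 100_000
odd = sum(random.randint(1, 1000) % 2 for _ in range(trials))
print(f"P(odd) ≈ {odd / trials:.3f}")  # ≈ 0.500
```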
If I claim that the gumballs in the box are blue, would you say there is a 50% chance of my claim being true?
What if I claimed that I ate pizza last night?
You might have a certain level of confidence in my accuracy and in my reliability as someone who would not lie to you; and, if someone were taking bets, you would probably bet based on how likely I am to be telling the truth, rather than assuming there was a 50% chance that I ate pizza last night.
If you then notice that my friend, who was with me last night, claims that I in fact ate pasta, then you have to weigh their reliability against mine, and, more importantly, you now have to start looking for reasons that we came to different conclusions about the same dinner. And finally, you have to weigh the effort it takes to vet our claims against how much you really care what I ate last night.
So, assuming you are rational, would you bet 50/50 that I ate pizza? Or would you just say “I don’t know” and refuse to bet in the first place?
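To put numbers on the “bet on reliability” intuition, here is a quick sketch of the expected value of taking even odds; the 0.9 reliability figure is purely an assumption for illustration:

```python
# Expected value of a $1 even-odds bet that the claim is true,
# given your estimate p that the claimant is accurate and honest.
# The 0.9 figure below is an illustrative assumption, not an estimate.
def even_odds_ev(p_true: float, stake: float = 1.0) -> float:
    # Win `stake` if the claim is true, lose `stake` if it is false.
    return p_true * stake - (1 - p_true) * stake

print(even_odds_ev(0.9))  # +0.8: betting "pizza" is attractive
print(even_odds_ev(0.5))  # 0.0: a genuine 50/50, no edge either way
```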
This is a bit of a side-track. For the Bayesian interpretation of probability, it’s important to be able to assign a prior probability to any event (since otherwise you can’t calculate the posterior probability, given some piece of evidence that makes the event more or less probable). Bayesians do this using, e.g., the much-contested principle of indifference. Some people object to this and argue, along your lines, that it’s just silly to ascribe probabilities to events we know nothing about. Indeed, frequentists define an event’s probability as the limit of its relative frequency in a large number of trials; to them, we can’t ascribe a probability to a one-off event at all.
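For concreteness, the prior-to-posterior step described above looks like this in the pizza/pasta case; every number is an invented placeholder, not anyone’s actual estimate:

```python
# Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E), with
# H = "the pizza claim is true", E = "the friend says pasta".
# All numbers below are illustrative placeholders.
p_h = 0.9              # prior: claimant is usually truthful/accurate
p_e_given_h = 0.1      # friend contradicts even though pizza is true
p_e_given_not_h = 0.8  # friend contradicts because pizza is false

p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)
posterior = p_e_given_h * p_h / p_e
print(f"P(pizza | friend says pasta) ≈ {posterior:.2f}")  # ≈ 0.53
```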
There is already a huge discussion on this, and I don’t think it’s meaningful for us to address it here. Still, you do have a point that one should be cautious about ascribing definite probabilities to events we know very little about. An alternative is to say that the probability lies somewhere in the interval from x to y, where x and y are real numbers between 0 and 1.
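One way to cash out the interval suggestion: push both endpoints through the same Bayesian update and report posterior bounds. A rough sketch, with the interval and likelihoods invented for illustration:

```python
# Interval prior: P(H) is only known to lie in [x, y].
# Pushing both endpoints through Bayes' rule gives posterior bounds,
# since the posterior is monotone in the prior for fixed likelihoods.
def posterior(prior: float, like_h: float, like_not_h: float) -> float:
    evidence = like_h * prior + like_not_h * (1 - prior)
    return like_h * prior / evidence

x, y = 0.2, 0.6  # assumed prior interval, purely illustrative
lo = posterior(x, like_h=0.7, like_not_h=0.3)
hi = posterior(y, like_h=0.7, like_not_h=0.3)
print(f"posterior lies in [{lo:.2f}, {hi:.2f}]")  # ≈ [0.37, 0.78]
```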
I agree that it is largely off-topic and don’t feel like discussing it further here, but I would like to point out that the principle of indifference requires your list of possibilities to be mutually exclusive and exhaustive. In practice, when dealing with multifaceted things such as claims about the effects of changing the minimum wage, an exhaustive list of possible outcomes would assign each outcome an arbitrarily small probability under the principle of indifference. The end effect is that the assignment is meaningless and you may as well ignore it.
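The “arbitrarily small probability” point is just arithmetic: indifference over N mutually exclusive, exhaustive outcomes assigns each outcome 1/N, which shrinks toward zero as the outcome space is carved finer:

```python
# Under the principle of indifference, N mutually exclusive and
# exhaustive outcomes each get probability 1/N. As a multifaceted
# question (e.g. minimum-wage effects) is split into finer outcomes,
# the indifference prior on any specific claim tends toward zero.
for n in (2, 10, 1_000, 1_000_000):
    print(f"{n:>9} outcomes -> prior per outcome = {1 / n:.6f}")
```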