This is a bit of a side-track. For the Bayesian interpretation of probability, it’s important to be able to assign a prior probability to any event (since otherwise you can’t calculate the posterior probability, given some piece of evidence that makes the event more or less probable). Bayesians do this using, for example, the much-contested principle of indifference. Some people object to this and argue, along your lines, that it’s just silly to ascribe probabilities to events we know nothing about. Indeed, the frequentists define an event’s probability as the limit of its relative frequency over a large number of trials. Hence, to them, we can’t ascribe a probability to a one-off event at all.
So there is already a huge discussion on this, and I don’t think it’s meaningful for us to address it here. Still, you do have a point that one should be cautious about ascribing definite probabilities to events we know very little about. An alternative is to say that the probability lies somewhere in the interval from x to y, where x and y are real numbers between 0 and 1.
I agree that it is largely off-topic and don’t feel like discussing it further here. I would, however, like to point out that the principle of indifference requires your list of possibilities to be mutually exclusive and exhaustive. In practice, when dealing with multifaceted things such as claims about the effects of changing the minimum wage, an exhaustive list of possible outcomes would assign each outcome an arbitrarily small probability under the principle of indifference. The end effect is that the assignment is meaningless and you may as well ignore it.
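To make the arithmetic behind that point concrete: the principle of indifference assigns each of n mutually exclusive, exhaustive outcomes a uniform prior of 1/n, so the prior shrinks toward zero as the outcome list is made more exhaustive. A minimal sketch (the function name and the example outcome counts are illustrative, not from the discussion above):

```python
def indifference_prior(n_outcomes: int) -> float:
    """Uniform prior over n mutually exclusive, exhaustive outcomes,
    as prescribed by the principle of indifference."""
    if n_outcomes < 1:
        raise ValueError("need at least one possible outcome")
    return 1.0 / n_outcomes

# The more exhaustively we enumerate possible outcomes,
# the smaller the prior assigned to each one becomes.
for n in (2, 100, 10_000):
    print(f"{n} outcomes -> prior {indifference_prior(n)}")
```

With two outcomes each gets 0.5; with ten thousand, each gets 0.0001, which is the sense in which the assignment becomes effectively meaningless for richly multifaceted claims.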