If we distinguish between “previous information” and “new information”, then yes. In this case, the OP made no such distinction, so I can only assume his use of “prior” means “previous to all information we know” (the background information I is empty—an uninformative prior).
By the way, I don’t really see a problem with starting with a prior of 0.5 for “the lady next door is a witch” (which could be formalized as “the lady next door has powers of action at a distance that break the laws of physics as we know them”). More generally, it’s reasonable to have a prior of 50% for “the lady next door is a flebboogy”, and then update that probability based on what information you have about flebboogies (for example, if despite intensive search nobody has been able to prove the existence of a single flebboogy, your probability will fall quite low).
However, taking a 50% probability of “she did it” (assuming only one person did “it”) wouldn’t be a good prior; a better one would be a probability of 1/N for each human, where N is the number of humans on Earth. Again, this estimate will vary wildly as you take more information into account: going up if the person who did “it” must have been nearby (and not, say, a teenager in Sri Lanka), and going down if she couldn’t possibly have done “it” without supernatural powers.
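As a sketch of that kind of updating (all numbers here are illustrative assumptions of mine, not real estimates):

```python
# Sketch: updating a uniform 1/N prior on "this person did it" with Bayes' rule.
# The population size and the count of nearby people are made-up numbers.

N = 8_000_000_000          # rough world population
prior = 1 / N              # uniform prior: any human is equally likely

# Evidence: whoever did "it" must have been nearby.
# Suppose about 1,000 people were nearby, and the suspect is one of them.
p_evidence_given_guilty = 1.0      # if she did it, she was certainly nearby
p_evidence_overall = 1_000 / N     # fraction of all humans who were nearby

posterior = prior * p_evidence_given_guilty / p_evidence_overall
print(posterior)  # about 1/1000: the prior mass concentrates on the nearby people
```

The uniform prior over all humans collapses onto the thousand nearby people, each now at roughly 1/1000.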
Anyway, I don’t think the OP was really asking about priors, he probably meant “how do we estimate the probability that a given war is just”.
I was referring to the idea that complex propositions should have lower prior probability.
Of course you don’t have to make use of it, you can use any numbers you want, but you can’t assign a prior of 0.5 to every proposition without ending up with inconsistency. To take an example that is more detached from reality—there is a natural number N you know nothing about. You can construct whatever prior probability distribution you want for it. However, you can’t just assign 0.5 to every possible property of N (for example, P(N > 10) = 0.5).
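To make the inconsistency concrete (a minimal sketch; the family of properties “N > k” is my own choice of example):

```python
# If we assigned P(N > k) = 0.5 for every k, then the probability that N falls
# in any finite range would be forced to zero:
#   P(k < N <= m) = P(N > k) - P(N > m) = 0.5 - 0.5 = 0
# So no finite interval could ever contain N, contradicting the requirement
# that the total probability over all natural numbers sums to 1.

def p_greater(k):
    return 0.5  # the (inconsistent) blanket assignment

def p_in_range(k, m):
    # Probability that k < N <= m, as implied by the blanket assignment.
    return p_greater(k) - p_greater(m)

print(p_in_range(0, 10**6))  # 0.0 -- even a huge range gets zero mass
```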
On the other hand, it has been argued that the prior of a hypothesis does not depend on its complexity.
There can also be problems with using priors based on complexity: for example, the predicates “the number, executed as a computer program, will halt” and “the number, executed as a computer program, will not halt” are both quite complex, but they are mutually exclusive and jointly exhaustive, so priors of 50% for each seem reasonable despite their complexity.
Assigning 0.5 to any possible property of N is reasonable as long as you don’t know anything else about those properties—if in addition you know that some are mutually exclusive (as in your example), you can update your probabilities accordingly. But in any case, the complexity of the description of a property can’t help us choose a prior.
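One simple form that update can take (my own illustrative numbers): if three properties you had each assigned 0.5 turn out to be mutually exclusive and jointly exhaustive, the naive probabilities sum past 1 and must be rescaled.

```python
# Three properties of N, each naively assigned probability 0.5.
naive = [0.5, 0.5, 0.5]

# Learning that they are mutually exclusive and jointly exhaustive means
# their probabilities must sum to exactly 1, so we renormalize:
total = sum(naive)
updated = [p / total for p in naive]
print(updated)  # each becomes 1/3
```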