How do you know something about the conjunction? Have you manufactured evidence from a vacuum?
I don’t think I am presuming them independent, I am merely stating that I have no information to favour a positive or negative correlation.
Look at it another way, suppose A and B are claims that I know nothing about. Then I also know nothing about A&B, A&(~B), (~A)&B and (~A)&(~B) (knowledge about any one of those would constitute knowledge about A and B). I do not think I can consistently hold that those four claims all have probability 0.5.
If you know nothing about A and B, then you know something about A&B. You know it is the conjunction of two things you know nothing about.
Since A=B is a possibility, the use of “two things” here is a bit specious. You’re basically saying you know A&B, but that could stand for anything at all.
You know that either A and B are highly correlated (one way or the other) or P(A&B) is close to P(A) P(B).
Yeah, and I know that A is the disjunction of A&B and A&(~B), and that it is the negation of the negation of a proposition I know nothing about, and lots of other things. If we take reading a statement and analysing its logical consequences to count as knowledge, then we know infinitely many things about everything.
In that case it’s clear where we disagree because I think we are completely justified in assuming independence of any two unknown propositions. Intuitively speaking, dependence is hard. In the space of all propositions the number of dependent pairs of propositions is insignificant compared to the number of independent pairs. But if it so happens that the two propositions are not independent then I think we’re saved by symmetry.
There are a number of different combinations of A and ~A and B and ~B but I think that their conditional “biases” all cancel each other out. We just don’t know if we’re dealing with A or with ~A, with B or with ~B. If for every bias there is an equal and opposite bias, to paraphrase Newton, then I think the independence assumption must hold.
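The “equal and opposite bias” claim can be made concrete: whatever the joint distribution of A and B, the four cells P(A&B), P(A&(~B)), P((~A)&B), P((~A)&(~B)) sum to 1, so averaging over which labels we happen to be holding (A or ~A, B or ~B) gives exactly 0.25 regardless of any correlation. A minimal sketch of that cancellation (function name and setup are mine, not from the thread):

```python
import random

def avg_conjunction_prob(p_ab, p_anb, p_nab, p_nanb):
    """Average P(X & Y) over the four relabelings X in {A, ~A}, Y in {B, ~B}.

    The arguments are the cells of the joint distribution: P(A&B), P(A&~B),
    P(~A&B), P(~A&~B). Flipping which proposition we call "A" or "B" merely
    selects a different cell as "the conjunction", so the symmetry average
    is just the mean of the four cells.
    """
    return (p_ab + p_anb + p_nab + p_nanb) / 4

# Draw random, arbitrarily correlated joint distributions and check that
# the symmetry-averaged conjunction probability always comes out 0.25.
random.seed(0)
for _ in range(1000):
    cells = [random.random() for _ in range(4)]
    total = sum(cells)
    assert abs(avg_conjunction_prob(*(c / total for c in cells)) - 0.25) < 1e-9
print("averaged over the A/~A and B/~B relabelings, P(conjunction) = 0.25")
```

Since the four cells always sum to 1, the average is 0.25 by construction; the simulation only illustrates that the cancellation holds however correlated A and B are.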
Suppose you are handed three closed envelopes each containing a concealed proposition. Without any additional information I think we have no choice but to assign each unknown proposition probability 0.5. If you then open the third envelope and if it reads “envelope-A & envelope-B” then the probability of that proposition changes to 0.25 and the other two stay at 0.5.
If not 0.25, then which number do you think is correct?
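Under the independence reading of the envelope scenario, the 0.25 can be checked with a quick Monte Carlo (a sketch of this hypothetical setup, treating each sealed envelope as true with probability 0.5):

```python
import random

random.seed(0)
trials = 200_000
both_true = 0
for _ in range(trials):
    a = random.random() < 0.5  # envelope-A contains a true proposition?
    b = random.random() < 0.5  # envelope-B contains a true proposition?
    if a and b:
        both_true += 1
print(both_true / trials)  # hovers around 0.25 when A and B are independent
```

Of course, this bakes in exactly the independence assumption under dispute; if the two envelopes could contain the same proposition, the frequency could be anywhere from 0 to 0.5.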
Okay, in that case I guess I would agree with you, but it seems a rather vacuous scenario. In real life you are almost never faced with the dilemma of having to evaluate the probability of a claim without even knowing what that claim is. It appears in this case that when you assign a probability of 0.5 to an envelope, you are merely assigning 0.5 probability to the claim that “whoever filled this envelope decided to put a true statement in”.
When, as in almost all epistemological dilemmas, you can actually look at the claim you are evaluating, then even if you know nothing about the subject area you should still be able to tell a conjunction from a disjunction. I would never, ever apply the 0.5 rule to an actual political discussion, for example, where almost all propositions are large logical compounds in disguise.
This can’t be right. An unspecified hypothesis can contain as many sentence letters and operators as you like; we still don’t have any information about its content and so can’t assign any probability other than 0.5. Take any well-formed formula in propositional logic. You can make that formula say anything you want by the way you assign semantic content to the sentence letters (this holds for propositional logic, not the predicate calculus, where one can specify independence). We have conventions where we don’t do silly things like say “A AND ~B” and then have B come out semantically equivalent to ~A. It is also true that two randomly chosen hypotheses from a large set of mostly independent hypotheses are likely to be independent. But this is a judgment that requires knowing something about the hypothesis, which we don’t, by stipulation. Note, it isn’t just causal dependence we’re worried about here: for all we know, A and B are semantically identical. By stipulation we know nothing about the system we’re modeling; the ‘space of all propositions’ could be very small.
The answer for all three envelopes is, in the case of complete ignorance, 0.5.
I think I agree completely with all of that. My earlier post was meant as an illustration that once you say C = A & B, you’re no longer dealing with a state of complete ignorance. You’re in complete ignorance of A and B, but not of C. In fact, C is completely defined as being the conjunction of A and B. I used the illustration of an envelope because as long as the envelope is closed you’re completely ignorant about its contents (by stipulation), but once you open it that’s no longer the case.
So the probability that all three envelopes happen to contain a true hypothesis/proposition is 0.125 based on the assumption of independence. Since you said “mostly independent” does that mean you think we’re not allowed to assume complete independence? If the answer isn’t 0.125, what is it?
edit:
If your answer to the above is “still 0.5” then I have another scenario. You’re in total ignorance of A. B denotes the proposition that a roll of a regular die comes up 6. What’s the probability that A & B are true? I’d say it has to be 1⁄12, even though it’s possible that A and B are not independent.
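The 1⁄12 is just the product rule under the assumed independence; using exact fractions (my arithmetic, spelling out the commenter’s claim):

```python
from fractions import Fraction

p_a = Fraction(1, 2)  # total ignorance about A: assign 0.5
p_b = Fraction(1, 6)  # a regular die comes up 6
p_conj = p_a * p_b    # product rule, assuming A and B are independent
print(p_conj)         # 1/12
```

The disputed step is precisely the multiplication: P(A & B) = P(A) · P(B) holds only if independence is granted.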
If you don’t know what A is and you don’t know what B is and C is the conjunction of A and B, then you don’t know what C is. This is precisely because one cannot assume the independence of A and B. If you stipulate independence then you are no longer operating under conditions of complete ignorance. Strict, non-statistical independence can be represented as A != B. A != B tells you something about the hypothesis: it’s a fact about the hypothesis that we didn’t have in complete ignorance. This lets us give odds other than 1:1. See my comment here.
With regard to the scenario in the edit, the probability of A & B is 1⁄6 because we don’t know anything about independence. Now, you might say: “Jack, what are the chances A is dependent on B?! Surely most cases will involve A being something that has nothing to do with dice, much less something closely related to the throw of that particular die.” But this kind of reasoning involves presuming things about the domain A purports to describe. The universe is really big and complex, so we know there are lots of physical events A could conceivably describe. But what if the universe consisted only of one regular die that rolls once! If that is the only variable then A = B. That we don’t live in such a universe, or that this universe seems odd or unlikely, are reasonable assumptions only because they’re based on our observations. But in the case of complete ignorance, by stipulation, we have no such observations. By definition, if you don’t know anything about A then you can’t know more about A&B than you know about B.
Complete ignorance just means 0.5; it’s just necessarily the case that when one specifies the hypothesis, one provides analytic insight into the hypothesis, which can easily change the probability. That is, any hypothesis that can be distinguished from an alternative hypothesis will give us grounds for ascribing a new probability to that hypothesis (based on the information used to distinguish it from alternative hypotheses).
Thanks for the explanation, that helped a lot. I expected you to answer 0.5 in the second scenario, and I thought your model was that total ignorance “contaminated” the model such that something + ignorance = ignorance. Now I see this is not what you meant. Instead it’s that something + ignorance = something. And then likewise something + ignorance + ignorance = something according to your model.
The problem with your model is that it clashes with my intuition (I can’t find fault with your arguments). I describe one such scenario here.
My intuition is that the probability of these two statements should not be the same:
A. “In order for us to succeed, one of 12 things needs to happen”
B. “In order for us to succeed, all of these 12 things need to happen”
In one case we’re talking about a disjunction of 12 unknowns and in the second we’re talking about a conjunction. Even if some of the “things” are not completely uncorrelated, that shouldn’t affect the total estimate that much. My intuition says that P(A) = 1 − 0.5^12 and P(B) = 0.5^12. Worlds apart! As far as I can tell, you would say that in both cases the best estimate we can make is 0.5. I introduce the assumption of independence (I don’t stipulate it) to fix this problem. Otherwise the math would lead me down a path that contradicts common sense.
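The gap the intuition points at is easy to make concrete. Assuming each of the 12 unknowns independently gets probability 0.5 (the commenter’s working assumption, not a stipulated fact):

```python
n = 12
p_disjunction = 1 - 0.5 ** n  # at least one of n independent 0.5-unknowns holds
p_conjunction = 0.5 ** n      # all n of them hold at once
print(p_disjunction)  # 0.999755859375
print(p_conjunction)  # 0.000244140625
```

Under the strict complete-ignorance view both compounds would instead get 0.5, which is exactly the divergence the two positions are arguing over.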