Let’s say my P(intelligent ancestor) is 0.1. Imagine I have a friend, Richard, who disagrees.
No wait, the order of these two things matters. Is P(intelligent ancestor|just my background information) = 0.1 or is P(intelligent ancestor|my background information + the fact that Richard disagrees) = 0.1? I agree that if the latter holds, conservation of expected evidence comes into play and gives the conclusion you assert. But the former doesn’t imply the latter.
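(As a quick gloss: conservation of expected evidence is just the law of total probability in disguise. The prior is a weighted average of the two possible posteriors,

$$P(H) = P(H \mid E)\,P(E) + P(H \mid \neg E)\,P(\neg E),$$

so if observing E would push P(H) up, then observing ¬E would have to push it down.)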
I agree that the order matters, and I should have discussed that in the post, but I think the conclusion holds either way. In the case where P(intelligent ancestor|just my background information) = 0.1 and I then learn that Richard disagrees, the probability goes above 0.1. But when I learn that Richard’s argument is bad, it comes back down, and I think it should end up below 0.1, assuming you antecedently knew that some smart people disagreed. You’ve learned that, for at least some smart believers in an intelligent ancestor, the arguments were worse than you expected.
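To make the claimed direction of the final update concrete, here is a minimal numerical sketch. Only the 0.1 prior comes from the exchange; every likelihood below is a made-up illustrative number (how often a smart person endorses the hypothesis, and how often a believer’s argument turns out bad, in worlds where the hypothesis is true versus false), chosen just to show that the sequence “up on the disagreement, down past the prior on the bad argument” is Bayesianly coherent.

```python
# A minimal sketch of the update sequence described above, using
# odds-form Bayes updates. All likelihoods are hypothetical numbers
# invented for illustration; only the 0.1 prior is from the exchange.

def odds(p):
    return p / (1 - p)

def prob(o):
    return o / (1 + o)

prior = 0.1                  # P(intelligent ancestor | background)

# Step 1: learn that Richard, a smart person, disagrees (believes in an
# intelligent ancestor). Assumed: smart people endorse the hypothesis
# more often in worlds where it is true.
lr_disagreement = 0.5 / 0.2  # P(Richard believes | H) / P(Richard believes | not-H)
after_disagreement = prob(odds(prior) * lr_disagreement)

# Step 2: learn that Richard's argument is bad. Assumed: bad arguments
# from believers are more common in worlds where H is false.
lr_bad_argument = 0.3 / 0.8  # P(bad argument | H) / P(bad argument | not-H)
after_bad_argument = prob(odds(after_disagreement) * lr_bad_argument)

print(f"after learning of the disagreement: {after_disagreement:.3f}")  # ~0.217
print(f"after seeing the bad argument:      {after_bad_argument:.3f}")  # ~0.094
```

With these (invented) numbers the probability rises to about 0.217 on learning of the disagreement, then falls to about 0.094 on seeing the bad argument, i.e. below the original 0.1, which is the pattern the reply asserts.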