I’m interested in the power of your belief. For example, I believe strongly that, say, Michael Vassar is smart. I also believe strongly that the laws of physics hold everywhere. If these two beliefs were brought into conflict (say, Michael Vassar presented me with a perpetual motion machine blueprint) physics would win, because it’s more powerful.
In that vein, I would like to take some of your time to ask you to come up with a quick power ranking of some of your deep beliefs. What if, say, your religion came into direct conflict with your faith? (I am not sure this is a fair question, actually; I personally can’t imagine what would happen if my rationality came into conflict with my sense of truth, because they’re so similar.)
Your concept of the power of a belief sounds a lot like its probability.
That’s because it is. Yes, the way I described power rankings as working, it is isomorphic to this:
A Bayesian agent has two beliefs, X and Y. If it discovers that X and Y are evidence against each other (P(X | Y) < P(X) and P(Y | X) < P(Y)), which belief will be updated more?
which is isomorphic to
How much evidence for X and how much for Y?
but those questions don’t cause most human brains to give good answers.
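To make that concrete, here is a minimal sketch (my own made-up numbers, assuming the two beliefs start out independent): take priors of 0.9 and 0.8 and condition on the news that the two beliefs cannot both be true. The weaker belief takes the larger hit, which is what a power ranking would predict.

```python
# Minimal sketch: two beliefs X and Y, assumed independent, with hypothetical priors.
p_x, p_y = 0.9, 0.8

# Learn that X and Y cannot both be true, i.e. condition on not-(X and Y).
p_not_both = 1 - p_x * p_y

# P(X | not both) = P(X and not Y) / P(not both), and likewise for Y.
post_x = p_x * (1 - p_y) / p_not_both
post_y = p_y * (1 - p_x) / p_not_both

print(round(post_x, 3), round(post_y, 3))              # 0.643 0.286
print(round(p_x - post_x, 3), round(p_y - post_y, 3))  # X drops 0.257, Y drops 0.514
```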
I think that thinking in terms of probability is going to be more conducive to careful thinking than thinking in terms of power. We’ve got a lot of emotional connections and alternative definitions for the word “power” which we don’t really want interfering with our reasoning, and which don’t come up when we speak of probability.
I kinda disagree here. If you show me an exact Bayesian network, I can read off it the degree to which evidence for one proposition is evidence against another. If you don’t give an exact interpretation in probability theory, then isn’t talking about “probability” instead of “power” just pretending to precision? Jumping to “probability” is something that has to be earned, and to me it’s not yet obvious that for all Bayesian graphs, if P(A) > P(B) > 0.5, then learning the truth of a descendant node which proves !(A & B) will cause B to decrease in probability more than A.
and to me it’s not yet obvious that for all Bayesian graphs, if P(A) > P(B) > 0.5, then learning the truth of a descendant node which proves !(A & B) will cause B to decrease in probability more than A.
Consider learning “not A,” for example.
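To spell that counterexample out (a hypothetical sketch; the node D and the numbers are mine, not the commenter’s): let the descendant D be true exactly when A is false. Observing D proves !(A & B), yet it is A, the more probable belief, that collapses, while B does not move at all.

```python
from itertools import product

# Hypothetical counterexample: A and B are independent root beliefs with
# P(A) > P(B) > 0.5, and D is a descendant that is true exactly when A is false,
# so observing D = true proves not-(A and B).
p_a, p_b = 0.9, 0.7

def prior(a, b):
    return (p_a if a else 1 - p_a) * (p_b if b else 1 - p_b)

def p_d_given(a, b):
    return 0.0 if a else 1.0   # D is a deterministic child of A: D = not-A

# Condition on D = true and read off the marginals of A and B.
cells = list(product([True, False], repeat=2))
z = sum(prior(a, b) * p_d_given(a, b) for a, b in cells)
post_a = sum(prior(a, b) * p_d_given(a, b) for a, b in cells if a) / z
post_b = sum(prior(a, b) * p_d_given(a, b) for a, b in cells if b) / z

print(round(post_a, 3), round(post_b, 3))   # 0.0 0.7: the stronger belief fell further
```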
The tradeoff occurring here seems to be reducing the possibility of triggering biases versus reducing the possibility that you’re fooling yourself into thinking that your thought is more precise than it really is. I would go with the first; if I felt that I was being insufficiently precise in a certain situation, I could use a couple of checks, such as seeing whether it managed to distinguish fiction from reality effectively.
On a more concrete note, I read this:
If these two beliefs were brought into conflict (say, Michael Vassar presented me with a perpetual motion machine blueprint) physics would win, because it’s more powerful.
as judging that if he estimated P(A) > P(B), P(A) would remain greater than P(B) given !(A & B), not as saying that !(A & B) was stronger evidence against B than against A.
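Those really are different claims, as a made-up illustration shows (the likelihoods below are hand-picked, not from the thread): a piece of evidence can prove !(A & B) and hit A harder than B while still leaving P(A) above P(B).

```python
from itertools import product

# Made-up numbers: A and B independent with P(A) > P(B) > 0.5, and evidence E that is
# impossible when A and B are both true, so observing E proves not-(A and B).
p_a, p_b = 0.95, 0.55

def prior(a, b):
    return (p_a if a else 1 - p_a) * (p_b if b else 1 - p_b)

def p_e_given(a, b):
    if a and b:
        return 0.0     # E proves not-(A and B)
    if a:
        return 0.1     # hand-picked likelihoods for the remaining cells
    if b:
        return 1.0
    return 0.0

cells = list(product([True, False], repeat=2))
z = sum(prior(a, b) * p_e_given(a, b) for a, b in cells)
post_a = sum(prior(a, b) * p_e_given(a, b) for a, b in cells if a) / z
post_b = sum(prior(a, b) * p_e_given(a, b) for a, b in cells if b) / z

print(round(post_a, 3), round(post_b, 3))               # 0.609 0.391: A still outranks B
print(round(p_a - post_a, 3), round(p_b - post_b, 3))   # 0.341 0.159: but A fell further
```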
Confused. What do you mean exactly? (Did you mean to type ‘your reason’? Or something else?)
I make a few presumptions here; correct me if I’m wrong.
I presume you do not simply have total faith in everything the Latter Day Saints teach; you don’t experience a sense of rightness on every single line of every single religious text (I’ve never met a religious person who does; this is something that only happens in strawman atheism arguments). But presumably you have also experienced a sense of rightness regarding some large part of LDS theology (again, based on my experiences with religious people), as that would be why you converted.
Now here’s the tricky part. If you read something that struck you as right (you got that sense of rightness about it), but when you shared it you found it directly contradicted some LDS doctrine, what would happen? Would you stop thinking the thing was right, or would you adjust your view of the LDS Church slightly downwards?
(The reason I am not sure this is fair is that if you asked me the same question in terms of rationality and truth-feeling, I would have a hard time not picking it apart, although in the least convenient possible world I would closely examine both my rationality and my feeling of truthness, and then rationality would win.)