Would you believe her if she said God was talking to her?
Aumann’s theorem can apply in reality, when the “boilerplate” conditions are approximately met and there is some mutual trust. It still doesn’t apply across deep ideological divides, because people with strongly different object-level beliefs don’t trust each other[*]. And of course, those are precisely the situations where it would be philosophically significant. So the boilerplate does matter.
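For reference, the “boilerplate” in question is the standard set of hypotheses of the theorem; a minimal statement (the usual textbook form, not anything specific to this thread) is:

```latex
% Aumann (1976), "Agreeing to Disagree": if two Bayesian-rational
% agents (1) share a common prior P over a state space, and
% (2) their posterior probabilities for an event E, given their
% private information partitions I_1 and I_2, are common knowledge
% between them, then those posteriors must be equal.
\[
q_1 = P(E \mid \mathcal{I}_1), \qquad q_2 = P(E \mid \mathcal{I}_2),
\]
\[
\text{common prior } P \;\wedge\; (q_1, q_2) \text{ common knowledge}
\;\Longrightarrow\; q_1 = q_2 .
\]
```

Each hypothesis (common prior, Bayesian rationality, common knowledge of the posteriors) is exactly the kind of condition that fails across an ideological divide.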
[*] Actually, Yudkowsky was inclined to distrust Aumann’s rationality because of Aumann’s theism.
It’s an important challenge, and if we generalize to ideologies broadly rather than focusing on the single case of a person who says God is talking to her, it’s something I’ve thought a bunch about.
I think most of the incompatible beliefs people come up with are not directly from people’s own experiences, but rather from Aumann-agreeing with other members of the ideologies who push those ideas. This isn’t specific to false ideologies; it also applies to, e.g., evolution, where it took most of human history before Charles Darwin came up with the idea and the principles behind it. It’s not something most people have derived for themselves from first principles.
So I tend to think of the object-level differences as originating heavily in differences in who one is Aumanning. Aumann’s theorem suggests that agreement depends on trust, so I see ideologies as constituted by networks of trust between people: ideas can flow within a network because of the trust, but they might not flow between networks, because the people there don’t trust each other.
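To make that picture concrete, here is a toy sketch (entirely my own illustration: DeGroot-style belief averaging is only a crude stand-in for iterated Aumann agreement, and all the names and numbers are made up):

```python
import random

# Toy model of beliefs flowing along trust edges. DeGroot-style
# averaging is a crude stand-in for iterated Aumann agreement;
# everything here is illustrative.

def pool_beliefs(beliefs, trust, rounds=20):
    """Each round, every agent replaces its belief with the average
    belief of the agents it trusts (itself included)."""
    for _ in range(rounds):
        beliefs = [
            sum(beliefs[j] for j in trust[i]) / len(trust[i])
            for i in range(len(beliefs))
        ]
    return beliefs

random.seed(0)

# Six agents: 0-2 form one trust network, 3-5 another.
# No trust edge crosses the boundary between the two networks.
trust = {
    0: [0, 1, 2], 1: [0, 1, 2], 2: [0, 1, 2],
    3: [3, 4, 5], 4: [3, 4, 5], 5: [3, 4, 5],
}
beliefs = [random.random() for _ in range(6)]

print([round(b, 3) for b in pool_beliefs(beliefs, trust)])
```

Running this, each cluster reaches an internal consensus, but the two clusters settle on different beliefs: agreement propagates along trust edges and stops dead at the boundary, which is the “ideas flow within networks but not between them” claim in miniature.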
The case of someone saying that God is talking to her is somewhat different from this, since it is a personal experience and is not based on ideological trust (in fact, I am under the impression that a lot of religions would agree that you are crazy if you think God is talking to you?).
I once encountered this sort of situation: someone close to me who was very mentally ill and had been on lots of psychiatric medications started seeing God in many places, and talked about how God wanted her to eat spiders and how the local equivalent of the CIA was spying on her. I think the establishment medical ideology calls this phenomenon “psychosis” and attributes it to a brain malfunction. The context and general outcome of the episode (she definitely didn’t save the world, despite supposedly having God as an advisor) make me prone to agreeing with that assessment. So while I don’t agree that God was actually talking to her, I do agree that she had the perception of God talking to her, and it was precisely because the rationality assumption in Aumann’s theorem was failing that I shouldn’t update on it more generally.
Going back to the case of ideologies, I think a few different things are happening. First, ideological networks can embed rationality failures like the above, or honesty failures, deep within themselves, and allow the people with those failures to spread their ideas to others, corrupting the entire network. Especially if the flawed beliefs being spread are sufficiently abstract, there might not be any good way for them to get noticed and corrected.
This is not limited to religion; science bundles accurate beliefs, like Darwinian evolution, together with inaccurate ones, like the claim that IQ test scores don’t depend on test-taking effort. This propensity of science to spread tons of falsehoods suggests that one should not trust science too much. But that also makes it difficult to untangle what should be believed from what shouldn’t.
This is getting a bit long and rambly, so I’ll end my comment here; I’d like to hear what you think in response.
I think most of the incompatible beliefs people come up with are not directly from people’s own experiences, but rather from Aumann-agreeing with other members of the ideologies who push those ideas.
It’s trust, rather than trust in rationality. There’s very strong evidence that people get most of their beliefs from their social background, but explicitly irrational ideologies operate the same way, so there’s little evidence that social trust is an Aumann mechanism.
Rationality has this tendency to ignore the “boilerplate”, the annoying details, in multiple cases. That leads to making claims that are too broad, or to diluting the meanings of terms; it’s often hard to say which. Bayesian probability gets treated as some kind of probabilistic reasoning, not necessarily quantitative; Aumann’s theorem gets read as just meaning that reasonable people should agree; and so on.