Yes, but what is the purpose of compartmentalizing beliefs into bin A and bin B where the things in bin A are true and the things in bin B are false?
There is a deep, deep bias on LW of thinking that truth is the only aspect of belief that has value. But there are tons of other aspects of beliefs that have value: how happy they make you, how much social acceptance they get you, how useful they are in achieving your goals. And many times these things are at cross purposes with the truth.
The beauty of compartmentalization is that you may be able to get the benefits of the truth while ALSO getting these other benefits.
It’s not a bias, but an expected-value calculation. Most falsehoods are utterly useless to believe, along the lines of “The moon is made of green cheese.” For the vast majority of the remaining cases, merely affecting a belief-in-belief can be useful, without actually spoiling your own reasoning abilities by swallowing a poison pill of deliberate falsehood in the name of utility.
The issue is that without possessing complete information about your environment, you can’t actually tell, a priori, which false beliefs are harmless and which ones will lose and lose badly.
When you have a sophisticated meta-level argument for object-level wrongness, you’re losing.
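To make the expected-value framing in this exchange concrete, here is a minimal toy sketch (my own illustration; the payoffs, the `p_relevant` parameter, and the function name are all made up, not anything from the thread). The point is that the sign of the calculation hinges on exactly the parameter you can’t observe without complete information about your environment: how often the false belief becomes decision-relevant.

```python
# Toy model: a false belief pays a small social benefit each period,
# but with unknown probability p_relevant it drives a real decision
# and incurs a large loss. All numbers are made up for illustration.

def ev_of_false_belief(social_benefit: float, disaster_cost: float,
                       p_relevant: float) -> float:
    """One-period expected value of holding the false belief."""
    return (1 - p_relevant) * social_benefit - p_relevant * disaster_cost

benefit = 1.0    # warm feeling / social acceptance per period
cost = 100.0     # loss when the belief actually drives a decision

for p in (0.001, 0.01, 0.05):
    print(f"p_relevant={p}: EV={ev_of_false_belief(benefit, cost, p):+.3f}")

# p_relevant=0.001: EV=+0.899  -> looks harmless
# p_relevant=0.01:  EV=-0.010  -> already negative
# p_relevant=0.05:  EV=-4.050  -> loses, and loses badly
```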
If you are a Muslim in many Islamic countries today, and you decide that Islam is false, and let people know it, you can be executed. This does not seem to have a high expected value.
Of course, you could decide it is false but lie about it, but people have a hard time doing that. It is easier to convince yourself that it is true, to avoid getting killed.
It’s really not that hard, especially in countries with institutionalized religions. Just keep going to mosque, saying the prayers, obeying the norms, and you’ve got everything most believers actually do, minus the belief.
But lying your entire life, even to your children (you can’t risk teaching them anything other than the official truth), can be mentally exhausting.
Add the fact that core religious ideas seem intuitively appealing to most people, to the point that even ostensibly atheist people often end up believing in variants of them, and you get why religion is so popular.
I don’t think David was saying that lying is hard, but that lying convincingly is hard. There’s a whole bunch of non-verbal, unconscious signals we send out that, at the very least, make it seem like “something is off” when we lie.
Yes, but Islamic societies don’t actually have a Thought Police. They care about the public affectations and the obedience to norms associated with religion, not about private convictions. Honestly, do people in the West really think Arabs actually believe 100% of the nonsense they spout?
I’m talking out of my ass now (I’m not sure if you are; have you lived in or studied the culture?), but I suspect the truth is somewhere in the middle.
If you truly came to resent the culture and beliefs of those around you, and didn’t compartmentalize, I suspect two things would happen:
1. You would be incredibly unhappy.
2. Others would pick up on your contempt, and it would be harder to make friends.
Yes, both of these happen. Also, it’s harder to be friends even with the people you already know, because you feel dishonest all the time (obviously, because you are in fact being dishonest with them).
Well, they do compartmentalize. They’re also fairly unhappy, but not to a level where they’re going to disrupt their entire social sphere and be ostracized (at best) from polite society for speaking up.
(Knowledge level: I have talked to n > 7 Arabs from the Arab world, mostly Jordanians, Palestinians, and Egyptians.)
Yes. Maybe not 100%, but 75-95%.
Let me attempt to phrase this well: people in the West believe that one shows status through high-handedness, by concealing emotions and not getting riled up. People in the Middle East believe that one shows status through bombast and braggadocio.
They mostly know the nonsense they’re spouting is nonsense, but in their minds, they’d be weak and low-status not to spout it, even in the expectation that nobody would really believe it (because it’s nonsense).
Agreed. This is what I was saying about compartmentalization.
I am not sure what point you are making. Without “possessing complete information about your environment,” you actually can’t tell which of your beliefs are true and which are false. Humans make do with estimates and approximations, as always.
That if you start deliberately believing false things, it’s not actually useful; it’s harmful. Expected regret almost always goes up from deliberately believing something you know to be wrong.
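A quick way to see why the regret claim points in one direction (again my own sketch with made-up numbers, not something from the thread): when the belief is not decision-relevant, true and false beliefs tie at zero regret; whenever it is decision-relevant, the false belief picks the worse action, so its expected regret is strictly positive.

```python
# Two actions; the true state favors A. A true belief picks A either
# way; a deliberately false belief picks B exactly when it becomes
# decision-relevant. Numbers are made up for illustration.

utility = {"A": 10.0, "B": 2.0}   # payoffs under the true state
best = max(utility.values())

def expected_regret(p_relevant: float, action_if_relevant: str) -> float:
    """Regret = best achievable payoff minus the payoff actually obtained."""
    return p_relevant * (best - utility[action_if_relevant])

print(f"true belief:  {expected_regret(0.05, 'A'):.2f}")  # 0.00
print(f"false belief: {expected_regret(0.05, 'B'):.2f}")  # 0.40 > 0
```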
This is true, but one could use some other terminology rather than abuse the word “truth”. Aumann is giving ammunition to every continental philosopher who argues that truth is relative or arbitrary, then tries to bring that into public policy.
I lose respect for Aumann for saying this. I have respect for Anders Sandberg, who in the past practiced some neo-paganism, with religious trappings, but when asked about it would explain (IIRC) that he was tricking his mind into behaving.