He’s explaining the process of compartmentalization. I suspect that if he had to bet on it as a matter of scientific fact, he would choose option A, but if he were discussing it with a rabbi, he would choose option B… he’s really just choosing which compartment of belief to draw from.
He agreed to disagree with himself. :)
So there is free money to be had by posing as a rabbi and offering a bet to Robert Aumann?
I suppose his compartments might get a bit confused at that point, but the scientific one would win out :).
Try it :-P
Yes, but what is the purpose of compartmentalizing beliefs into bin A and bin B where the things in bin A are true and the things in bin B are false?
A religion can be seen as a metaphor, or as a way of organizing or prioritizing thoughts, but saying “Truth is within our minds” is, well, false.
That reminds me that I’m planning a post on “higher truths” in literature. The short version is that I think that when people talk about “higher truths”, they really mean “things fuzzy enough that I can twist them to say whatever I like.”
There is a deep, deep bias on LW of thinking that truth is the only aspect of belief that has value. But there are tons of other aspects of beliefs that have value: how happy they make you, how much social acceptance they get you, how useful they are in achieving your goals. And many times these things are at cross purposes with the truth.
The beauty of compartmentalization is that you may be able to get the benefits of the truth while ALSO getting these other benefits.
It’s not a bias, but an expected-value calculation. Most falsehoods are utterly useless to believe, along the lines of “The moon is made of green cheese.” In the vast majority of other cases, merely affecting a belief-in-belief can be useful, without actually spoiling your own reasoning abilities by swallowing a poison pill of deliberate falsehood in the name of utility.
The issue is that without possessing complete information about your environment, you can’t actually tell, a priori, which false beliefs are harmless and which ones will lose and lose badly.
When you have a sophisticated meta-level argument for object-level wrongness, you’re losing.
If you are a Muslim in many Islamic countries today, and you decide that Islam is false, and let people know it, you can be executed. This does not seem to have a high expected value.
Of course, you could decide it is false but lie about it, but people have a hard time doing that. It is easier to convince yourself that it is true, to avoid getting killed.
It’s really not that hard, especially in countries with institutionalized religions. Just keep going to mosque, saying the prayers, obeying the norms, and you’ve got everything most believers actually do, minus the belief.
But lying your entire life, even to your children (you can’t risk teaching them anything other than the official truth), can be mentally exhausting.
Add the fact that core religious ideas seem intuitively appealing to most people, to the point that even ostensibly atheist people often end up believing in variants of them, and you get why religion is so popular.
I don’t think David was saying that lying is hard, but that lying convincingly is hard. There’s a whole bunch of non-verbal, unconscious signals we send out that, at the very least, make it seem like “something is off” when we lie.
Yes, but Islamic societies don’t actually have a Thought Police. They care about the public affectations and the obedience to norms associated with religion, not about private convictions. Honestly, do people in the West really think Arabs actually believe 100% of the nonsense they spout?
I’m talking out of my ass now (I’m not sure if you are; have you lived in or studied the culture?), but I suspect the truth is somewhere in the middle.
If you truly came to resent the culture and beliefs of those around you, and didn’t compartmentalize, I suspect two things would happen:
You would be incredibly unhappy.
Others would pick up on your contempt, and it would be harder to make friends.
Yes, both of these happen. Also, it’s harder to be friends even with the people you already know, because you feel dishonest all the time (obviously because you are in fact being dishonest with them).
Well, they do compartmentalize. They’re also fairly unhappy, but not to a level where they’re going to disrupt their entire social sphere and be ostracized (at best) from polite society for speaking up.
(Knowledge level: have talked with n > 7 Arabs from the Arab world, mostly Jordanians, Palestinians, and Egyptians.)
Yes. Maybe not 100%, but 75-95%.
Let me attempt to phrase this well: people in the West believe that one shows status through high-handedness, by concealing emotions and not getting riled up. People in the Middle East believe that one shows status through bombast and braggadocio.
They mostly know the nonsense they’re spouting is nonsense, but in their minds, they’d be weak and low-status not to spout it, even in the expectation that nobody would really believe it (because it’s nonsense).
Agreed. This is what I was saying about compartmentalization.
I am not sure what point you are making. Without “possessing complete information about your environment” you actually can’t tell which of your beliefs are true and which are false. Humans make do with estimates and approximations, as always.
That if you start deliberately believing false things, it’s not actually useful; it’s harmful. Expected regret almost always goes up from deliberately believing something you know to be wrong.
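To put a rough number on that intuition, here is a minimal expected-value sketch in Python. Every quantity in it is a hypothetical stand-in invented for illustration (the comfort bonus, the cost, the probabilities); nothing here comes from anyone in this thread.

```python
# Toy model: compare adopting a pleasant falsehood against believing the truth.
# All numbers are hypothetical stand-ins for illustration only.

COMFORT_BONUS = 1.0            # utility gained from holding the pleasant falsehood
COST_WHEN_LOAD_BEARING = 10.0  # utility lost when the falsehood feeds a real decision

def ev_of_false_belief(p_matters: float) -> float:
    """Expected utility of the falsehood, relative to believing the truth.

    p_matters is the probability that some future decision ends up
    depending on the belief; this is exactly the quantity you can't
    pin down a priori without complete information about your environment.
    """
    return COMFORT_BONUS - p_matters * COST_WHEN_LOAD_BEARING

for p in (0.0, 0.05, 0.2, 0.5):
    print(f"P(belief matters) = {p:.2f}  ->  EV of falsehood = {ev_of_false_belief(p):+.2f}")
```

On these toy numbers, the falsehood only comes out ahead when it almost never becomes load-bearing (EV is +1.00 at p = 0, already negative by p = 0.2), which is the earlier point in the thread: you can’t tell in advance which false beliefs sit safely in that region.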
This is true, but one could use some other terminology rather than abuse the word “truth”. Aumann is giving ammunition to every continental philosopher who argues that truth is relative or arbitrary, then tries to bring that into public policy.
I lose respect for Aumann for saying this. I have respect for Anders Sandberg, who in the past practiced some neo-paganism, with religious trappings, but when asked about it would explain (IIRC) that he was tricking his mind into behaving.
This is what he says:
“It is a different view of the world, a different way of looking at the world. That’s why I prefaced my answer to your question with the story about the roundness of the world being one way of viewing the world. An evolutionary geological perspective is one way of viewing the world. A different way is with the six days of creation. Truth is in our minds. If we are sufficiently broad-minded, then we can simultaneously entertain different ideas of truth, different models, different views of the world.”
I don’t think he’s talking about separating beliefs into true ones and false ones.
The point of his discussion of the roundness of the world is that in order to say that, we are idealizing and approximating. Idealizing and approximating are things that happen in the mind, not in the world; that is why he says that “truth is in the mind,” and in that respect he is right, even if truth is in things in another way.
Obviously the world was not formed in six days even in the way it is round. You cannot simply idealize and approximate in the same way and get that result. Aumann is aware of this, since otherwise he wouldn’t say that you need a different way of looking at the world; you could look at it in the same way, as young earth creationists do. That makes it clear that he does not accept a literal six days; if he did, he wouldn’t say you need a different way to look at things.
He was making a comparison. Idealizing and approximating are a scientific way to make statements about the world. Metaphor is another way, and that’s what he was talking about. But metaphor is not simply about making false statements, so separating literal and metaphorical statements is not simply dividing between true and false.