So can you please explain what he means? I really don’t understand in what sense it can be said that “the world is 15 billion years old” and “the world was created by God in six days” can both be literally true. And it doesn’t sound like he means the Omphalos argument that the world was created looking old. Rather, it sounds like he’s saying that in one sense of “truth” or in one “model of the world” it really is 15 billion years old, and in another sense / model it really is young, and those two truths / models are somehow not contradictory. I just can’t seem to wrap my head around how that might make any sense.
He’s explaining the process of compartmentalization. I suspect that if he had to bet on it against the background of a scientific fact, he would choose option A, but if he were discussing it with a rabbi, he would choose option B… he’s really just choosing which compartment of belief to draw from.
He agreed to disagree with himself. :)
So there is free money to be had by posing as a rabbi and offering a bet to Robert Aumann?
I suppose his compartments might get a bit confused at that point, but the scientific one would win out :).
Try it :-P
Yes, but what is the purpose of compartmentalizing beliefs into bin A and bin B where the things in bin A are true and the things in bin B are false?
A religion can be seen as a metaphor, or as a way of organizing or prioritizing thoughts, but saying “Truth is within our minds” is, well, false.
That reminds me I’m planning a post on “higher truths” in literature. The short version is that I think that when people talk about “higher truths”, they really mean “things fuzzy enough that I can twist them to say whatever I like.”
Yes, but what is the purpose of compartmentalizing beliefs into bin A and bin B where the things in bin A are true and the things in bin B are false?
There is a deep deep bias on LW of thinking that truth is the only aspect of belief that has value. But there are tons of other aspects of beliefs that have value—how happy they make you, how much social acceptance they get you, how useful they are in achieving your goals—and many times these things are at cross purposes with the truth.
The beauty of compartmentalization is that you may be able to get the benefits of the truth while ALSO getting these other benefits.
There is a deep deep bias on LW of thinking that truth is the only aspect of belief that has value.
It’s not a bias, but an expected-value calculation. Most falsehoods are utterly useless to believe, along the lines of “The moon is made of green cheese.” In the vast majority of other cases, merely affecting a belief-in-belief can be useful without actually spoiling your own reasoning abilities by swallowing a poison pill of deliberate falsehood in the name of utility.
The issue is that without possessing complete information about your environment, you can’t actually tell, a priori, which false beliefs are harmless and which ones will lose and lose badly.
When you have a sophisticated meta-level argument for object-level wrongness, you’re losing.
If you are a Muslim in many Islamic countries today, and you decide that Islam is false, and let people know it, you can be executed. This does not seem to have a high expected value.
Of course, you could decide it is false but lie about it, but people have a hard time doing that. It is easier to convince yourself that it is true, to avoid getting killed.
Of course, you could decide it is false but lie about it, but people have a hard time doing that.
It’s really not that hard, especially in countries with institutionalized religions. Just keep going to mosque, saying the prayers, obeying the norms, and you’ve got everything most believers actually do, minus the belief.
But lying your entire life, even to your children (you can’t risk teaching them anything other than the official truth) can be mentally exhausting.
Add the fact that core religious ideas seem intuitively appealing to most people, to the point that even ostensibly atheist people often end up believing in variants of them, and you get why religion is so popular.
I don’t think David was saying that lying is hard—but that lying convincingly is hard. There’s a whole bunch of non-verbal, unconscious signals we send out, that at the very least, make it seem like “something is off” when we lie.
Yes, but Islamic societies don’t actually have a Thought Police. They care about the public affectations and the obedience to norms associated with religion, not about private convictions. Honestly, do people in the West really think Arabs actually believe 100% of the nonsense they spout?
I’m talking out of my ass now (I’m not sure if you are; have you lived in or studied the culture?), but I suspect the truth is somewhere in the middle.
If you truly came to resent the culture and beliefs of those around you, and didn’t compartmentalize, I suspect two things would happen:
You would be incredibly unhappy.
Others would pick up on your contempt, and it would be harder to make friends.
Yes, both of these happen. Also, it’s harder to be friends even with the people you already know, because you feel dishonest all the time (obviously, because you are in fact being dishonest with them).
Well, they do compartmentalize. They’re also fairly unhappy, but not to a level where they’re going to disrupt their entire social sphere and be ostracized (at best) from polite society for speaking up.
(Knowledge level: have talked to n > 7 Arabs from the Arab world, usually Jordanians, Palestinians, and Egyptians.)
Yes. Maybe not 100%, but 75-95%.
Let me attempt to phrase this well: people in the West believe that one shows status through high-handedness, by concealing emotions and not getting riled-up. People in the Middle East believe that one shows status through bombast and braggadocio.
They mostly know the nonsense they’re spouting is nonsense, but in their minds, they’d be weak and low-status not to spout it, even in the expectation that nobody would really believe it (because it’s nonsense).
Agreed. This is what I was saying about compartmentalization.
The issue is that without possessing complete information about your environment, you can’t actually tell, a priori, which false beliefs are harmless and which ones will lose and lose badly.
I am not sure what is the point that you are making. Without “possessing complete information about your environment” you actually can’t tell which of your beliefs are true and which are false. Humans make do with estimates and approximations, as always.
I am not sure what is the point that you are making.
That if you start deliberately believing false things, it’s not actually useful; it’s harmful. Expected regret almost always goes up from deliberately believing something you know to be wrong.
This is true, but one could use some other terminology rather than abuse the word “truth”. Aumann is giving ammunition to every continental philosopher who argues that truth is relative or arbitrary, then tries to bring that into public policy.
I lose respect for Aumann for saying this. I have respect for Anders Sandberg, who in the past practiced some neo-paganism, with religious trappings, but when asked about it would explain (IIRC) that he was tricking his mind into behaving.
This is what he says:
“It is a different view of the world, a different way of looking at the world. That’s why I prefaced my answer to your question with the story about the roundness of the world being one way of viewing the world. An evolutionary geological perspective is one way of viewing the world. A different way is with the six days of creation. Truth is in our minds. If we are sufficiently broad-minded, then we can simultaneously entertain different ideas of truth, different models, different views of the world.”
I don’t think he’s talking about separating beliefs into true ones and false ones.
The point of his discussion of the roundness of the world is that in order to say that, we are idealizing and approximating. Idealizing and approximating are things that happen in the mind, not in the world; that is why he says that “truth is in the mind,” and in that respect he is right, even if truth is in things in another way.
Obviously the world was not formed in six days even in the way it is round. You cannot simply idealize and approximate in the same way and get that result. Aumann is aware of this, since otherwise he wouldn’t say that you need a different way of looking at the world; you could look at it in the same way, as young earth creationists do. That makes it clear that he does not accept a literal six days; if he did, he wouldn’t say you need a different way to look at things.
He was making a comparison. Idealizing and approximating are a scientific way to make statements about the world. Metaphor is another way, and that’s what he was talking about. But metaphor is not simply about making false statements, so separating literal and metaphorical statements is not simply dividing between true and false.
The sentence “Frodo carried the One Ring to Mount Doom” is not literally true, but it is true within the fictional narrative of The Lord of the Rings. You can simultaneously believe it and not believe it, in a certain sense, by applying the so-called “suspension of disbelief”, a mental mechanism which probably evolved to allow us to consider hypothetical counterfactual beliefs for decision making and which we then started using to make fiction.
I think that theists like Robert Aumann who support the non-overlapping magisteria position are doing something similar: they accept “the world is 15 billion years old” as an epistemic “Bayesian” belief which they use when considering expectations over observations, and they apply suspension of disbelief in order to believe “the world was created by God in six days” in the counterfactual context of religion.
The world was created in six days, with evidence indicating 15 billion years of whatever. For that matter, the earth was created one second ago, your memories included.
That’s what the Omphalos argument is.
I really don’t understand in what sense it can be said that “the world is 15 billion years old” and “the world was created by God in six days” can both be literally true.
Where did you get young earth creationism from above? Where did you get “6 day creation is literally true given earth days”? If this is how you are parsing Aumann, why are you even talking about this?
And a “different way is with the six days of creation.”
Now he doesn’t strictly say that he holds both simultaneously, but I think that can be implied.
I repeat, where did you get literal 6 days, as in six 24-hour Earth days? Isn’t there a steelman custom on LW?
I can invent lots of reasonable interpretations for the Genesis myth of creation after thinking about it for a little bit. Isn’t there a concept of “day of Brahma,” and so on? I am sure smart theologians who spend their lives on this can, too!
I think Aumann’s deeper point is about map/territory, and how we should treat modeling as more-tentative-than-currently-customary (almost everything is modeling, and since all models are false it is useful to hedge/diversify). A lot of modern science is actually conditional on “convenient” statistical models that are not easy to defend.
Yes, but there is a point where we should put our feet down.
I think Aumann’s deeper point is about map/territory, and how we should treat modeling as more-tentative-than-currently-customary (almost everything is modeling, and since all models are false it is useful to hedge/diversify).
Even a very diverse map has to bother with object-level predictions that fit the object-level territory. Religion has so far been an utter failure at doing so.
Of course, one could charge that it’s not intended to do so, and yack on about separate magisteria or compartmentalization, but in that case, bite the bullet and simply admit that words like “true” or “real”, in their everyday sense of mapping a territory, do not apply to religion.
I am pretty sure Aumann is biting the bullet.
Do you think that Aumann’s statement can only be interpreted as six 24 hour days?
Of course, one could charge that it’s not intended to do so, and yack on about separate magisteria
This is a very jarring dismissal of a very difficult to resolve problem, despite it being very old. Here are some maps that do not yield testable predictions:
-Other people exist
-Other people are conscious
-I was not created in the last minute with all of my current memories
Epistemology is much more than creating testable predictions.
Do you think that Aumann’s statement can only be interpreted as six 24 hour days?
No, it can also be interpreted in uselessly non-predictive ways.
This is a very jarring dismissal of a very difficult to resolve problem, despite it being very old.
I very much disagree.
Here are some maps that do not yield testable predictions:
-Other people exist
-Other people are conscious
-I was not created in the last minute with all of my current memories
Excuse me, but any sensible forms of those hypotheses do yield testable predictions—unless you’re confused about the meaning of words like “exist” and “conscious”. Let’s list things out:
Other people will exhibit object permanence, a consistency of state across observations. This is a simple prediction, and in fact, “other people exist” is the simplest hypothesis explaining it. Since “I am in the Matrix” is much more complex, it requires its own unique evidence to differentiate it from the simpler “other people exist”.
Other people will behave as if they can introspect on internal experiences. Again, simple prediction (though actually reasonably complex: it requires me to have a theory-of-mind), but a prediction generated by a muuuuch simpler hypothesis than “Other people are p-zombies.” In fact, if others are p-zombies and I’m not, then we’ve got a suspicious, weird uniqueness that would itself require explanation.
My memories will be consistent with present and future observations. And in fact, to be even more specific, the world will be consistent with my previous existence in ways that I don’t possess memories matching-up to: I might find my keys somewhere in my apartment where I didn’t remember leaving them, and then remember how I dropped them there yesterday night when very tired. Again, you could posit a Matrix Lord or a malevolent deity who’s deliberately faking everything you experience, but then you just need an explanation for that which still accords with no Matrix-y stuff happening (like the same black cat walking past you twice, in the same direction, in two minutes).
Please leave the philosophy-woo back in undergrad alongside your copy of Descartes’ Meditations.
From your last line, I think it’s unlikely that this is going to be productive. It sounds like you think that epistemology is simply erudite nonsense and philosophers need to just accept probably Bayesianism or the scientific method or something. I think this is quite disappointing; mathematicians could have similarly dismissed attempts to ground calculus in something other than loose arguments of the form “well, it works, what more do you want?”, but we would have a much less rich and stable field as a result. But if this is a mischaracterization of your view of epistemology then please let me know.
It sounds like you think that epistemology is simply erudite nonsense
Much of it is, yes.
philosophers need to just accept probably Bayesianism or the scientific method or something.
This would require that “Bayesianism” or “the scientific method” or “or something” actually be a full, formalized solution to How to Reason Inductively. We currently possess no such solution; this does not, by any means, mean that no such solution can exist and we all have to resort to throwing intuitions at each other or adopt broad skepticism about the existence and contents of reality.
What I recommend is to move past the trivialities, having accepted that the eventual solution will be abductive (in the sense required to dismiss skepticism about the external world or the consciousness of others as silly, which it is), and set to work on the actual details and formalizations, which are of course where all the hard work remains to be done.
(By the way, the reference to philosophy-woo is because professional epistemologists tend not to be radical skeptics. The idea that there just isn’t an external reality is mainly only taken seriously by undergrads first learning the subject.)
Directly above the passage quoted in OP is this:
“When I was young, there were many attempts by religious people to “reconcile” science and religion. For example, each of the six days of creation can be viewed as representing a different geological era. There was—and perhaps still is—a view that science contradicts religion, that one has to reconcile them. It is apologetic, and I don’t buy it.”
He is swallowing the bullet on separate magisteria, here.
Yes.
Give it a rest already.
The antidote to cultishness is reading things by a diverse selection of smart people.