Eliezer, it makes me nervous when my behavior or reasoning differs from the vast majority of human beings. Surely that’s a reasonable concern?
On this planet? No. On this planet, I think you’re better off just worrying about the object-level state of the evidence. Your visceral nervousness has nothing to do with Aumann. It is conformity.
Knowing that people are crazy and the world is mad helps a bit, but not too much because people who are even crazier than average probably explain their disagreements with the world in exactly this way.
What do you care what people who are crazier than average do? You already have enough information to know you’re not one of them. You care what these people do, not because you really truly seriously think you might be one of them, but because of the gut-level, bone-deep fear of losing status by seeming to affiliate with a low-prestige group by saying something that sounds similar to what they say. You may be reluctant to admit that you know perfectly well you’re not in this group, because that also sounds like something this low-prestige group would say; but in real life, you have enough info, you know you have enough info, and the thought has not seriously crossed your mind in a good long while, whatever your dutiful doubts of your foregone conclusion.
Seriously, just make the break, clean snap, over and done.
So, I’m inclined to try to find more detailed explanations of the differences. Is there any reason you can think of why that might be unproductive, or otherwise a bad idea?
Occam’s Imaginary Razor. Spending lots of time on the meta-level explaining away what other people think is bad for your mental health.
You’re wrong, Eliezer. I am sure that I’m not crazier than average, and I’m not reluctant to admit that. But in order to disagree with most of the world, I have to have good reason to think that I’m more rational than everyone I disagree with, or have some other explanation that lets me ignore Aumann. The only reason I referred to people who are crazier than average is to explain why “people are crazy, the world is mad” is not one of those explanations.
Spending lots of time on the meta-level explaining away what other people think is bad for your mental health.
That’s only true if I’m looking for rationalizations, instead of real explanations, right? If so, noted, and I’ll try to be careful.
But in order to disagree with most of the world, I have to have good reason to think that I’m more rational than everyone I disagree with
You’re more rational than the vast majority of people you disagree with. There, I told you up front. Is that reason enough? I can understand why you’d doubt yourself, but why should you doubt me?
That’s only true if I’m looking for rationalizations, instead of real explanations, right? If so, noted, and I’ll try to be careful.
I’m not saying that you should deliberately stay ignorant or avoid thinking about it, but I suspect that some of the mental health effects of spending lots of time analyzing away other people’s disagreements would happen to you even if you miraculously zeroed in on the true answer every time. Which you won’t. So it may not be wise to deliberately invest extra thought-time here.
Or maybe divide healthy and risky as follows: Healthy is what you do when you have a serious doubt and are moving to resolve it, for example by reading more of the literature, not to fulfill a duty or prove something to yourself, but because you seriously think there may be stuff out there you haven’t read. Risky is anything you do because you want to have investigated in order to prove your own rationality to yourself, or because it would feel too immodest to just think outright that you had the right answer.
The only reason I referred to people who are crazier than average is to explain why “people are crazy, the world is mad” is not one of those explanations.
It is if you stick to the object level. Does it help if I rephrase it as “People are crazy, the world is mad, therefore everyone has to show their work”? You just shouldn’t have to spend all that much effort to suppose that a large number of people have been incompetent. It happens so frequently that if there were a Shannon code for describing Earth, “they’re nuts” would have a single-symbol code in the language. Now, if you seriously don’t know whether someone else knows something you don’t, then figure out where to look and look there. But the answer may just be “4”, which stands for Standard Explanation #4 in the Earth Description Language: “People are crazy, the world is mad”. And in that case, spending lots of effort in order to develop an elaborate dismissal of their reasons is probably not good for your mental health and will just slow you down later if it turns out they did know something else. If by a flash of insight you realize there’s a compact description of a mistake that a lot of other people are making, then this is a valuable thing to know so you can avoid it yourself; but I really think it’s important to learn how to just say “4” and move on.
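(A minimal sketch of the Shannon-code metaphor, in Python: given a frequency table of standard explanations, Huffman’s algorithm assigns the shortest codeword to the most common one, which is the sense in which “they’re nuts” would earn a near-single-symbol code. All explanation names and frequencies below are invented purely for illustration; the dialogue specifies none of them.)

```python
# Sketch of the "Shannon code for describing Earth" metaphor:
# a Huffman code over how often each standard explanation applies.
# Explanations and frequencies are hypothetical, for illustration only.
import heapq
from collections import Counter

def huffman_code(freqs):
    """Return {symbol: codeword} for a frequency table, via Huffman's algorithm."""
    # Heap entries: (total_frequency, unique_tiebreak, {symbol: partial_codeword})
    heap = [(f, i, {sym: ""}) for i, (sym, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)  # two least-frequent subtrees
        f2, _, c2 = heapq.heappop(heap)
        # Prepend one bit to every codeword in each merged subtree.
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (f1 + f2, counter, merged))
        counter += 1
    return heap[0][2]

# Hypothetical frequencies with which each explanation turns out to apply:
explanations = Counter({
    "people are crazy, the world is mad": 50,
    "they know something you don't": 20,
    "incentives point the other way": 15,
    "honest difference in priors": 10,
    "you are the crazy one": 5,
})

code = huffman_code(explanations)
for sym, word in sorted(code.items(), key=lambda kv: len(kv[1])):
    print(f"{word:>6}  {sym}")
# The most frequent explanation gets the shortest codeword --
# the sense in which "they're nuts" is nearly a single symbol.
```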
It will come as a surprise to few people that I disagree strongly with Eliezer here; Wei should not take his word for the claim that Wei is so much more rational than all the folks he might disagree with that he can ignore their differing opinions. Where is this robust rationality test used to compare Wei to the rest of the intellectual world? Where is the evidence for this supposed mental health risk of considering the important evidence of the opinions of others? If the world is crazy, then very likely so are you. Yes, it is a good sign if you can show some of your work, but you can almost never show all of your relevant work. So we must make inferences about the thought we have not seen.
Well, I think we both agree on the dangers of a wide variety of cheap talk—or to put it more humbly, you taught me on the subject. Though even before then, I had developed the unfortunate personal habit of calling people’s bluffs.
So while we can certainly interpret talk about modesty and immodesty in terms of rhetoric, isn’t the main testable prediction at stake the degree to which Wei Dai should often find, on further investigation, that people who disagree with him turn out to have surprisingly good reasons to do so?
Do you think—to jump all the way back to the original question—that if Dai went around asking people “Why aren’t you working on decision theory and anthropics because you can’t stand not knowing the answers?” that they would have some brilliantly decisive comeback that Dai never thought of which makes Dai realize that he shouldn’t be spending time on the topic either? What odds would you bet at?
Brilliant decisive reasons are rare for most topics, and most people can’t articulate very many of their reasons for most of their choices. Their most common reason would probably be that they found other topics more interesting, and to evaluate that reason Wei would have to understand the reasons for thinking all those other topics interesting. Saying “if you can’t prove to me why I’m wrong in ten minutes I must be right” is not a very reliable path to truth.
I’d expect a lot of people to answer “Nobody is paying me to work on it.”