It is clear that Eliezer advocates telling yourself things that may not actually be true.
I don’t think so. He is advocating telling yourself something on the condition that telling it to yourself causes it to be true.
It’s not equivalent to telling yourself “I’m attractive to the opposite sex.” Say that you doubted this prior to uttering it. Then, yes, after uttering it, you might have reason to think that it is marginally more likely to be true. But you almost certainly wouldn’t be justified in believing it with high confidence. That is, you still shouldn’t believe the statement, so telling it to yourself is dishonest.
In contrast, Eliezer is suggesting that perhaps regularly uttering the statement
I can’t get away with double-thinking! Deep down, I’ll know it’s not true! If I know my map has no reason to be correlated with the territory, that means I don’t believe it!
does alter you so as to make itself true. If that’s right, then, conditioned on your having uttered it, you are justified in believing what you uttered, so you are not being dishonest.
It’s not a matter of being outside of reality. The utterance is part of reality. That’s precisely why it may have the power to cause itself to be true.
Of course, it may be that this particular statement just doesn’t have that power. If the probability that it lacks this power were above a certain threshold, I expect that Eliezer wouldn’t advocate saying it unless it were already true.
What evidence is there that yelling at yourself like this is going to make a difference? Let us imagine two kinds of people: those who cannot fall into Moore’s paradox (believing the map but not the territory) and those who can. People in the first class, who are immune to the problem, will gain no benefit from reciting these mantras. People in the second class, for whom there is a real risk of making these kinds of errors, are supposed to vigorously tell themselves that there is no such risk! They are supposed to lie to themselves in the hope that the lie will become true. But why should they believe it?
And how different is this lie, really, from the wannabe god-worshiper who similarly insists to himself that he believes that god exists, even though it is not true?
I can’t help wondering whether this posting is meant to be ironic. It comes perilously close to outright self-contradiction.
Hal, perhaps Eliezer’s view is that there are “suggestible” portions of one’s mind that it is okay to suggest things to, but there is some other, reason-capable faculty that one can and should use to form true, un-self-deceived, evidence-before-bottom-line beliefs.
Whether or not that’s Eliezer’s view, the above view seems right to me. It would be silly not to suggest useful frames, emotional stances, energy levels, etc. to the less rational parts of myself—that would leave me freezing in particular, arbitrary/chance/un-useful starting states. But for the part of myself that can do full cost-benefit analyses, and math, and can assemble my best guess about the world—misleading that part of myself would be terrifying, like putting my eyes out. (I mean, I deceive the reason-capable part of myself all the time, like most humans. But it’s terrifying that I do, and I really really want to do otherwise… including by suggestibility tricks, if they turn out to help.)
Tyrrell and Anna have stated my views better than I’d previously gone so far as verbalizing.
There are large sectors of the mind in which belief tends to become reality, including important things like “I am the sort of person who continues even in the face of adversity” and “I do have the willpower to pass up that cookie.”
But—given that you aren’t actually trying to fool yourself—there’s a chicken-and-egg aspect that depends on your having enough potential in this area that you can legitimately believe the statement will become true if you believe it. At that point, you can believe it and then it will be true.
There’s an interesting analogy here to Löb’s Theorem which I haven’t yet categorized as legitimate or fake.
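(For reference, since the analogy may not be familiar: Löb’s Theorem says that if a theory such as Peano Arithmetic proves “if P is provable, then P”, then it proves P outright; formally, if PA ⊢ (□P → P), then PA ⊢ P. The loose parallel is a belief whose truth is guaranteed by its being held, and which therefore ends up simply true for the believer.)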
To look at it another way, this sort of thing is useful for taking simultaneous steps of self-confidence and actual capability in cases where the two move in lockstep. Or, in the case of anti-competencies like doublethink, the reverse.
“I have the potential to be the sort of person who continues even in the face of adversity”, or “it is more in my interests to pass up that cookie”, or “I really do have a choice whether or not to pass up that cookie”. That is what I would recommend.
bill, below, has mentioned “Act as if”: “I choose to Act as If I can continue even in the face of adversity, and I intend in this precise moment to continue acting, even if I may just fall down again in two minutes’ time”.
These have the advantage of being more likely to be true.
Rambling on a little, to be the sort of person who continues in the face of adversity is Difficult, and requires practice, and that practice is very worthwhile. Stating that it is True might make you fail to do the practice, and instead beat yourself up when it appears not to be true.
Dishonest or not, convincing yourself that you’re attractive to the opposite sex is more likely to produce a positive result. And a rationalist should win. ;-)