I was expecting to find someone commenting about beliefs whose truth-value may be hard to know but whose effect is positive nonetheless. Several examples (which I don’t necessarily personally endorse)
If believing this homeopathic sugar pill works will make it work,
I desire to believe that this sugar pill works.
If believing this homeopathic sugar pill works will not make it work,
I desire to believe that this sugar pill does not work.
Let me not become attached to beliefs that do not serve me.
or
If believing in synchronicities will cause more good things to happen in my life,
I desire to believe in synchronicities.
If believing in synchronicities will not cause more good things to happen in my life,
I desire to not believe in synchronicities.
Let me not become attached to beliefs I do not want.
It appears that, if you actually have the ability to self-modify your beliefs like this, the “Litany of Instrumentarski” could be a useful way to deal with the way rationality breaks things like the placebo effect. With sugar pills, or whatever, if you can adopt the positive side of beliefs that are self-fulfilling prophecies (true either way you believe them, e.g. the Pygmalion effect), then that ought to be conducive to winning.
That’s a good point. I guess I still have quite a ways to go to rid myself of the notion of external reality, which I was subconsciously assuming. If belief changes reality, too bad for reality. It’s the accuracy of the belief that is important.
What’s ‘accuracy’ without ‘reality’?
Self-fulfilling beliefs don’t mean there’s no external reality; they just mean your mind (and thus your beliefs) is part of reality, and therefore capable of influencing it. If it weren’t, naturally you would be unable to act on your beliefs in any case. The correct belief is, of course, “if someone believes X, X will occur; if someone believes Y, Y will occur.”
EDIT: The last sentence, which is slightly tangential to the rest, has been moved (on the theory that it was attracting downvotes) to increase the signal-to-noise ratio. It still exists in the comment below, if you wish to downvote it.
-removed from the above post, for the curious and/or offended.
That seems like a reasonable response to shminux’s post, so I’m not sure why you were at −2 (unless it was for your final sentence).
Huh, I hadn’t noticed that. You’re probably right; such a statement is something of an anti-applause light here on LessWrong. (And, to be fair, with good reason.)
EDIT: I think I’ll remove it, actually … I’ll move it to a comment so as not to torture the poor souls who saw this cryptic conversation.
But if the belief is accurate either way, then you can basically pick whatever belief you want. This is the weird paradox of self-fulfilling prophecies, like the Pygmalion effect. So what then?