Whenever I read something on Less Wrong about how to change my mind, I feel guilty for not immediately changing my mind about everything I believe. This post especially. I’ve already examined my beliefs and concluded they are worth holding, and I’ve already taken all the advice on how to maintain rational beliefs, but the style of these posts still makes me feel guilty for being as committed as I am to what I am fairly sure are rational beliefs. Of course, I hope this comment doesn’t lead anyone to believe that they don’t need to relentlessly focus on changing their mind. Recognizing that it is hard and annoying to be constantly vigilant is not an excuse not to be.
On the other hand, it could be that I’ve just internalized the rhetoric and made myself immune to the Less Wrong style of belief-correction. Reading this post, for example, I noted with satisfaction that I believe that following my “sacred beliefs” is in contradiction with following “animal urges” like enjoying myself or morality. But even asceticism or radicalism can be a defense of some perniciously deep-seated wrong idea. The only genuine defense against irrationality is constant self-examination; the only genuinely problematic beliefs are those that bias or otherwise prevent one’s self-examination.
Change your mind. Seriously. Identify the underlying beliefs that result in the guilt, assess whether they are rational or beneficial and then change them. Because they are neither. Guilt (usually) sucks as an ongoing motivator.
I don’t think you quite understood my meaning. I can see why, though, as my post is not very clear. Edited it a little.
I don’t really have anything significant to change my mind about, as I’m reasonably certain that my major beliefs are without error. I just feel a social pressure to change my mind because many of these posts on Changing Your Mind seem to decry having any level of certainty that your beliefs are rational and correct. I feel guilty that I have that certainty, which I think is justified, when I supposedly should not.
Wedrifid’s comment still applies. Examine the social pressure, identify how it produces guilt in your mind, and then change your mind so that it no longer produces guilt.
Recently I’ve really liked the “brain as cognitive engine” metaphor, so in that vein I offer you a different interpretation of what “change your mind” means: altering your brain. So changing your mind is no longer “I believed X, but now I believe Y” and is more like “My brain used to generate X but I shut off the Z input and removed the Q cogitator and now it generates Y”.
I’ve seen it said here a lot that overconfidence is a problem, but so is underconfidence. If you think your certainty (or, more ideally, near-certainty) is justified, and you can give reasons why, then any social pressure you might be perceiving to be less confident is misplaced.
I noted with satisfaction that I believe that following my “sacred beliefs” is in contradiction with following “animal urges” like enjoying myself or morality
Could you expound upon this?
Oh, it’s just the fairly straightforward notion that, considering my limited resources, I should pursue eternal goals rather than any personal interests, but that personal interests are constantly thwarting my efforts to pursue eternal goals. Fairly standard akrasia stuff; I guess I could have made that clearer.
In local terminology, “morality” refers to the meaningful kind of “eternal goals”, and some notions of “eternal goals” are seen as confusion, so your original statement remains unclear.
I don’t wish to reveal that “more important value”, because I think it would be very distracting
From this alone I expect that you have something to change your mind about. Don’t avoid discussing it, or at least have a plan for developing new epistemic tools. :-)
But now I’m curious :(
it’s just a value that if revealed would derail any and all threads
By saying that, in a community as insatiably curious as LW, you now have dozens of people (including me) persistently wondering what the heck it could be.
:)
p=.65 that it’s either political or sexual in nature.
Let’s find a bookmaker who will accept bets on my secret value. I’ll take 10% and we can resolve all issues. Bets above $100 only, please.
I think you’re maybe making it a lot worse by being deliberately coy? If you actually wanted to avoid derailing a thread, wink-nudge-hint wasn’t the way to go. I’m not even sure why it was necessary to mention your secret objectionable value at all if you truly didn’t want to talk about it.
Oh, of course I am. I’ve already derailed this with my momentary lapse in judgment, though, so I don’t see any harm in continuing to respond.
If you look closely, you’ll notice that it was your relentless evasiveness, rather than the belief itself, that caused the derailment.
The reason I mentioned epistemic tools is that it’s possible to be wrong about what your own values are, but people sometimes don’t easily accept this idea. Where you expect people to find your value objectionable, I expect people to see you as mistaken about this value actually being your own. You believe it is, but it’s probably not (based on the indirect evidence you have revealed).
You would be surprised, maybe… If you don’t want to derail a public thread, would you send me a private message to discuss this?
Yeah, I sometimes get the unhealthy impulse to wish I were more wrong, so that I could discover it and have something to change my mind about, or to change my mind about things randomly despite that leading to much less accurate beliefs. (That I don’t act on these impulses shouldn’t need saying, and probably doesn’t.)