My complete non-reaction to this (aside from being convinced utterly and without counterargument) suggests to me that I have “achieved” the “break with the world” that Eliezer sometimes goes on about as a required first step to becoming a rationalist. That’s kind of a relief but not really.
A huge part of having emotional convulsions when your beliefs are broken, I suspect, is identifying with your beliefs, a habit that builds up piecemeal, again and again.
I think that we may face the opposite danger though, of being too prepared to accept claims that are issued with challenges to show our rational impartiality. I think that’s the sort of reaction I caught myself having when I read it.
I would guess that it’s more the case of people here having more flexible priors. I find that I’m generally less certain of the truth of my beliefs than most people. By my estimation, most people are damn sure of things that they have no way of knowing anything about. I’m probably a defense attorney’s wet dream, because I always have doubt.
I worry about this all the time, but honestly, we know rationality is hard, and we know you’re never quite sure you’re doing it right. Because humans tend to work much harder at validating their own beliefs than at validating conflicting ones, applying the virtue of evenness probably means something closer to trying to reject your own beliefs where possible than to “searching equally hard for flaws in your own arguments as in others’”. If you state the latter goal, you are likely to fall into one of the many, many traps that let you think you were right all along.
But I do worry about it all the time. What if, instead of being a leaf on the wind, I’m a rocket-propelled leaf trying to figure out the direction of the wind prematurely and flying off into space, never to be seen again? Then I’d be in space, and there’s no oxygen there, and leaves can’t survive without oxygen.
I do think it’s possible to overcome. If there’s any rationalist skill I feel I’ve developed to a notable level, it’s the ability to scrutinize my own internal monologue as it occurs, rather than trying to work out after the fact what I was thinking. So while part of me was urging me to accept the claim to prove my impartiality to myself, I was able to notice that urge as I was reading.
Do you have any advice on how not to become internally polarized? I sometimes find myself wanting something to be true, but then, when I realize that (that I want it to be true), some part of me tries to compensate for my “emotional favoritism,” and I end up with one side dismissing anything I would like to be true in a somewhat compulsive manner and another side inducing negative emotions every time its candidate gets knocked down.
I try to emulate the views of an impartial person in internal monologue. I think it helps to engage in a lot of debates and discussions and take pains to observe the differences between people who have incentives to engage in motivated reasoning and people who don’t, so you can notice in yourself “That doesn’t seem like a way I’d respond if I weren’t engaging in motivated reasoning,” or “that really does seem like what I’d expect from an unbiased person.”
My complete non-reaction to this (aside from being convinced utterly and without counterargument) suggests to me that I have “achieved” the “break with the world” that Eliezer sometimes goes on about as a required first step to becoming a rationalist.
Your comment here reminds me that I have much more work to do! My thought process usually takes several steps before I come to the right conclusion, like:
1. “But I like fireplaces!”
2. “But I like my safety and the safety of the people I care about more.”
“Holy shit, I can’t believe I just put 1 before 2.”
Reminds self of desire to see reality, calms mind, focuses on processing the article.