I guess you just kind of learn from experience when not to trust yourself. For example, I avoid writing late at night, because I know it usually comes out too emotional, even though I don’t notice it at the time. For me it works better to come up with cool ideas at night and then write them up in the morning. Also, when drawing I pay extra attention to symmetry, because I know I’m a bit blind to symmetry flaws in my own drawings. Maybe something similar could work for politics.
How do you deal with noticing you’ve been mindkilled?
When I realize that my emotions are about whether I am winning or losing the debate, as opposed to… uhm… the feeling of exploration. When it becomes about the social feedback (looking at other people instead of looking at the territory) rather than about exploring various possible paths. When it seems like the multiple possible paths are not even there.
The worst (and also most typical) case is that I notice this only afterwards, when the debate is over. The best case is that I notice it in the middle of a (typically online) debate, and then I lift my fingers from the keyboard and take a short walk (often grabbing some water to drink). Sometimes I just erase the unfinished comment and close the browser tab, and this is sometimes more difficult than it should be.
Afterwards, I try to (1) calm down, then (2) think about what I should have said instead, and (3) find someone rational and ask them, “At this moment my opinion on the topic is X; does that seem like a sane opinion to you?” The answer is often “approximately yes, but I would also add Y and Z”; then I think about it and maybe update somewhat.
Note that I don’t necessarily try to reach perfect agreement with the other rational person; it’s more about staying within what would be a wannabe-rationalist’s equivalent of the “Overton window”: something like “you may seem a bit miscalibrated to me, but not in a way or to a degree that makes me suspect your sanity” or “we have approximately the same model of the world, just with more emphasis on different parts”.
My private opinion is usually something that all political groups would consider heresy, and it usually admits that each of them has a point about something but blows it out of proportion. That doesn’t mean I pretend to be wise and neutral; I may agree with one side on 90% and the other side on 10%. Or it’s not even about percentages, but more like “they are right about this specific detail, they just put it into completely the wrong context and thus draw completely wrong conclusions” or “these people are doing it mostly right, but then suddenly here they have a huge blind spot and go batshit crazy when anyone touches it, what a pity”.
It seems to me that a good defense against mindkilling is to remind yourself, whenever people discuss some group, that the group consists of multiple individuals, each of them different from the others, and that any statement someone makes about the group is possibly true of some members and false of others. Not as a dogmatic conclusion, but as a reasonable “null hypothesis”.
Well, how do you deal with noticing some other error, or that you’ve fallen prey to some other bias? What is different about ‘tribalism’ that makes it so distressing?
I’ll half-answer this, since it’s sort of a tangent, but the metric I prefer to use is the variation of my feelings over time. I don’t know if other people are like this (probably), but my mood and emotions affect my views on politics and policy.
Sometimes, when I feel ill or am in a bad mood, some political event of class A will make me upset and convinced that everything will turn out poorly. After I lift weights, riding what I perceive as a testosterone-fueled good mood, I feel confident that a political event of class A won’t be a big deal, and confident in my ability to persevere. Usually I take this variance in my predictions of the future as evidence that I’m being mind-killed.
Another heuristic: if reading one or two articles on a topic you already know a fair amount about makes you feel strong emotions and strongly changes your prediction of the future, you might be mind-killed.
A final strategy I tried was subscribing to different political meme pages on Facebook (libertarian, Ann Coulter-ish, alt-right, progressive), and I’d notice how I sometimes would slowly change my views based on which ones I was looking at. I know that admitting to subconsciously changing political views because of political memes is about as embarrassing as saying you bought a Taco Bell meal because of a Taco Bell commercial, but as far as I can tell we are very susceptible to this stuff, even the stupidest memes. (Sometimes, even if I hate them, I start substituting them deep in my mind for the other side’s actual arguments.)

Anyway, those are a few of my tactics.
How do you deal with noticing you’ve been mindkilled?
I recently did something I regret, and on reflection I note that the impetus was probably anti-Purple sentiment.
(Ironically, I managed to simultaneously demonstrate some epic hypocrisy: I was inveighing against mindkilling anti-Orange sentiment at the time.)
This bothers me. I’ve done what I can to repair the error, but it still bothers me. I assume it’s not an uncommon experience, though. Thoughts?