My experience is that LWers are accepting of a lot of cognitive strategies, even ones that are not truth-optimizing. See the posts tagged “dark arts,” for example: https://www.lesswrong.com/tag/dark-arts
There are beliefs I would not update if an endless amount of evidence against them came my way, because they are how I keep myself from suicidal ideation.
That sounds like a load-bearing bug. We discussed these types of structures at the CFAR workshop I attended. Please don’t remove the structure that keeps you from being suicidal.
What I’m trying to gesture at is: Even if you are not a typical LWer, I don’t think you are as far off the distribution as you think.
Thank you for the reassurance. Honestly, I am not sure if what I said there while feeling bad is entirely true; I perceive the world in radically different ways in different emotional states and when I feel okay the idea of changing my mind doesn’t seem like as big a deal. Still big, but not impossible and soul-crushing. Just terrifying and life-changing lol.
Also, a lot of what I used to call “beliefs” were actually more like… hm, I know there’s an Eliezer post for this, but I’m not sure which… mantras, in a sense. Stuff where you say it and go “yay” and feel very faithful and spiritual for having said it, but you don’t actually stop to think about whether it means anything. I still have a huge number of such patterns left over from my possibly-schizotypal early teen years, but nowadays I am trying to redefine my beliefs in terms of the predictions they make about future experiences.
I do not have to believe false things in order to be mentally healthy; I just have to have the right values. I think that’s what scares me when talking to people: I feel like they are going to try to change my value system, which places “spirituality” (which feels like a very specific, real thing to me but is very hard to define quickly) at a very high level of importance. And changing my value system would be tantamount to a kind of death in itself.