I think there is a question of whether current LessWrong is the right place for this discussion (there are topics that will attract unwanted attention, and when faced with substantial adversarial forces, I think it is OK for LessWrong to decide to avoid those topics as long as they don’t seem of crucial importance for the future of humanity, to have those discussions in more obscure ways, to limit visibility to just some subset of logged-in users, etc.). But leaving that discussion aside, basically everything in this post strikes me as “obviously true”, and when I first encountered the Eliezer Facebook post that this post responds to, I had a very similar reaction to the one the OP describes.
And I do think that response mattered for my relationship to the rationality community. At the time, I really did feel like Eliezer was trying to make my map of the world worse, and it shifted my epistemic risk assessment of being part of the community from “I feel pretty confident trusting my community leadership to maintain epistemic coherence in the presence of adversarial epistemic forces” to “well, I sure have to at least do a lot of Straussian reading if I want to understand what people actually believe, and should expect that, depending on the circumstances, community leaders might make up sophisticated stories for why pretty obviously true things are false in order to not have to deal with complicated political issues”.
I do think that was the right update to make, and it was overdetermined for many different reasons, though it still deeply saddens me.
“well, I sure have to at least do a lot of Straussian reading if I want to understand what people actually believe, and should expect that, depending on the circumstances, community leaders might make up sophisticated stories for why pretty obviously true things are false in order to not have to deal with complicated political issues”
I kinda disagree that this is a mere issue of Straussian reading: I suspect that in this case (and others), you are seeing the raw output of Eliezer’s rationalizations, not some sort of instrumental coalition-politics dark arts. If I were going for some sort of Straussian play, I wouldn’t bring it up unprompted or make long public declarations like this.
Zack is hypersensitive to this one issue because it interacts with his Something to Protect. But what I wonder about is where else Eliezer is trying to get away with things like this.
Yeah, I agree with this in this specific instance, hence the “at least do a lot of Straussian reading” part. I do think there is a spectrum from radical honesty, to Straussian reading, to something that looks like this, and it makes sense to consider the Straussian case in many situations.
I think this is a good place for this post (especially because it responds directly to Eliezer, but even if it didn’t), and I would like to read more such posts here, and more posts related to Wokeness (as long as they meet the usual standard for posts here, like for any other topic). I don’t know if I want to start another “politics on LW” discussion, because there have already been so many, and many of them were good, but I’ll echo again the feeling that this distancing from politics often interferes with truth-seeking rather than helping it.
Is that a different relationship to the “rationality community” or just to Eliezer?
(I should maybe also mention that I don’t mind this (i.e., Zack’s) post.)
I think it changed my relationship to the community overall. Within that relationship, Eliezer is actually still the single person I would probably trust the most, despite posts like this.
Um—why? I guess I can think of some mechanisms by which this would make you trust the remaining community less, but none that seem convincing. And I guess I don’t personally have the impression that the community is deteriorating.
I mean, evidence of an instance is evidence about the class. Also, again, I think all of these updates were pretty overdetermined, and my previous relationship to community leadership was quite naive and overly hopeful. I also don’t think the community is deteriorating, and I never intended to say anything like that! I think this is a normal relationship to have to the community and its leadership.
Do you mean Zack’s post or Eliezer’s post?
Zack’s post. (I can see how that was unclear; I mentioned it because I’ve been on record in the past saying that some things shouldn’t be on LW.)