Committed not to Eliezer’s insights but to exaggerated versions of his blind spots
My guess would be that this is an attempt to apply a general critique of what tends to happen in communities to the LW community, without accounting for its specifics.
Most people in the LW community would say that Eliezer is overconfident or even arrogant (sorry Eliezer!).
The incentive gradient for status-hungry folk is not to double down on Eliezer’s views, but to double down on their own idiosyncratic version of rationalism, different enough from the community’s to be interesting, but similar enough to be legible.
(Also, I strongly recommend the post this is replying to. I was already aware that discourse functioned in the way described, but it helped me crystallise some of the phenomena much more clearly.)
I’m thinking of cases like Eliezer’s Politics is the Mind-Killer, which makes the relatively narrow claim that politically loaded examples are bad for illustrating principles of rationality when learning or teaching those principles, and so should be avoided when a less politicized alternative is available. I think this falsely assumes that it’s feasible, under current circumstances, for some facts to be apolitical in the absence of an active, political defense of the possibility of apolitical speech. But that’s a basically reasonable and sane mistake to make. Then I see LessWrongers proceed as though Politics is the Mind-Killer established canonically that it is bad to mention when someone is saying or doing something politically loaded, which interferes with the sort of defense that Politics is the Mind-Killer implicitly assumed was a solved problem.
Or how Eliezer both explicitly wrote at length against treating intellectual authorities as specially entitled to opinions AND played with themes of being an incomprehensibly powerful optimization process, but the LessWrong community ended up crystallizing around an exaggerated version of the latter while mostly ignoring his explicit warnings against authority-based reasoning. Eliezer has personally commented on this (higher-context link that may take longer to load):
“How dare you think that you’re better at meta-rationality than Eliezer Yudkowsky, do you think you’re special”—is somebody trolling? Have they never read anything I’ve written in my entire life? Do they have no sense, even, of irony? Yeah, sure, it’s harder to be better at some things than me, sure, somebody might be skeptical about that, but then you ask for evidence or say “Good luck proving that to us all eventually!” You don’t be like, “Do you think you’re special?” What kind of bystander-killing argumentative superweapon is that? What else would it prove?
I really don’t know how I could make this any clearer. I wrote a small book whose second half was about not doing exactly this. I am left with a sense that I really went to some lengths to prevent this, I did what society demands of a person plus over 10,000% (most people never write any extended arguments against bad epistemology at all, and society doesn’t hold that against them), I was not subtle. At some point I have to acknowledge that other human beings are their own people and I cannot control everything they do—and I hope that others will also acknowledge that I cannot avert all the wrong thoughts that other people think, even if I try, because I sure did try. A lot. Over many years. Aimed at that specific exact way of thinking. People have their own wills, they are not my puppets, they are still not my puppets even if they have read some blog posts of mine or heard summaries from somebody else who once did; I have put in at least one hundred times the amount of effort that would be required, if any effort were required at all, to wash my hands of this way of thinking.
Or how Eliezer wrote about how modern knowledge work has become harmfully disembodied and dissociated from physical reality—going into detail about how running from a tiger engages your whole sensorimotor system in a way that staring at a computer screen doesn’t—but lots of LessWrongers seem to endorse and even celebrate this very dissociation from physical reality in practice.