After writing this, not sure if I endorse this whole sentiment. To elaborate: it sounds to me like “pseudo-rationality” is just being bad at rationality, and if people really wanted to optimize for social status in the rationality community there is one easiest canonical way to do this: get good at rationality. So there’s only a second-order difference between optimizing for pseudo-rationality and optimizing for rationality, and your post sort of just sounds like criticizing people for being bad rationalists in an unproductive tone.
There’s a flavor of pseudo-rationality which is about optimizing for social approval from other pseudo-rationalists, e.g. trying to write LW posts by mimicking Eliezer’s writing style or similar.
> if people really wanted to optimize for social status in the rationality community there is one easiest canonical way to do this: get good at rationality.
I think this is false: even if your final goal is to optimize for social status in the community, real rationality would still force you to locally give it up in pursuit of convergent instrumental goals. There is in fact a significant first-order difference.
Can you elaborate on this? I have the feeling that I agree now but I’m not certain what I’m agreeing with.
One example is that the top tiers of the community are in fact composed largely of people who directly care about doing good things for the world, and this (surprise!) comes together with being extremely good at telling who’s faking it. So in fact you won’t be socially respected above a certain level until you optimize hard for altruistic goals.
Another example is that whatever your goals are, in the long run you’ll do better if you first become smart, rich, and knowledgeable about AI, sign up for cryonics, prevent the world from ending, etc.