I’ve finally figured out why Eliezer was popular. He isn’t the best writer, or the smartest writer, or the best writer for smart people, but he’s the best writer for people who identify with being smart. This opportunity still seems open today, despite tons of rational fiction being written, because its authors are more focused on showing how smart they are, instead of playing on the self-identification of readers as Eliezer did.
It feels like you could do the same trick for people who identify with being kind, or brave, or loving, or individualist, or belonging to a particular nation… Any trait that they secretly feel might be undervalued by the world. Just fan it up and make it sound like the most important quality in a person. I wonder how many writers do it consciously.
The Sequences also contain criticism of smart people who are smart in the wrong way (“clever arguers”), and even of smart people in general (“why our kind can’t cooperate”).
Making smartness sound like the most important thing gives you Mensa or RationalWiki; you also need harsh lessons on how to do it right to create Less Wrong. Maybe so harsh that most people who identify with being X will actually turn against you, because you assigned low status to their way of doing X.
And by the way, effective altruism is already using this strategy in the field of… well, altruism.
Can you give specific examples of him doing that?
Not the OP, but I suspect the parts that rub many people the wrong way are the following:
The quantum physics sequence; specifically, that Eliezer claims to know the right answer to which interpretation of quantum physics is scientific, even though professional quantum physicists can’t all agree on an answer. (“I am so smart I know science better than the best scientists in the world, despite being a high-school dropout.”)
Dismissing religion. (“I am so smart I know for sure that billions of people are wrong, including the theologians who spent their whole lives studying religion.”)
The whole “sense that more is possible” approach. (Feels like bragging about the abilities of you and your imaginary tribe of smart people, supported by the fictional evidence of the Beisutsukai superheroes, to illustrate how highly you think of yourself.)
I guess people with different attitudes will see the relative importance of these parts differently. If you start reading the book already not believing in the supernatural and not emotionally invested in quantum physics, you will be like: “The supernatural is not-even-wrong? Yeah. Many worlds? I guess this is what biting the bullet really means, huh. Could we do better? Yeah, that’s a nice dream, and perhaps not completely impossible.” And then you focus on the parts about how to avoid specific mistakes.
But if you start reading the book strongly believing in the supernatural, or in the Copenhagen interpretation, or that nerds are inherently and irreparably losers, you will probably be like: “Oh, this guy is so wrong. And so overconfident. Oh, please, someone slap him already to remind him of his real status, because this is so embarrassing. Jesus, this low-status nerd is now surrounded by other low-status nerds who worship him. What a cringe-fest!”
So different people can come away with completely different interpretations of what the Sequences are actually about. If you dismiss all the specific advice, what remains looks like a status move, because when people write books saying “other people are wrong, and I am right”, it usually is a status move.
I agree with these examples, but cousin_it said specifically that
its authors are more focused on showing how smart they are, instead of playing on the self-identification of readers as Eliezer did
and these examples all seem to be more “Eliezer showing off how smart he is” rather than “Eliezer making his readers feel smart”.
Though now that it’s been pointed out, I agree that there’s a sense in which Eliezer also does the latter, and does more of it than the average focused-on-the-former writer… but this distinction seems a little fuzzy to me, and it’s not entirely clear to me what the specific things he does are.