I don’t see why “rationalism” would be a good thing to organize around; but I don’t think that’s what Eliezer is talking about. As cousin_it noted, Eliezer is implying that rationalism implies altruism. Should we add altruism to the bundle of extra-rational values that Eliezer thinks are part of rationalism? Combined with his insistence that “rationalists always win”, and his earlier comment that a Bayesian master would place inherent value on rationality, that would make 3 irrational elements of Yudkowskian rationalism.
Eliezer’s search for a “rationalist gestalt” that can be a lifestyle, rather than just a tool for thinking, probably has a lot to do with the accusations of cultism that he is rightly concerned about. The one sacred rule of rationalism is that you not make it sacred.
Rationality is sometimes equated with altruism, liberalism, and egalitarianism, when actually those are just historically-contingent alliances. (This matters when addressing the charges made against rationalism by, say, Nietzsche, or Allan Bloom, who say rationalism ⇒ egalitarianism ⇒ utility placed on mean rather than maximum values ⇒ crappy art. Basically, the charge against rationalism is really a charge against egalitarianism, but it’s sexier and more socially acceptable to say you’re attacking rationalism. But that’s a subject for another post.)
As cousin_it noted, Eliezer is implying that rationalism implies altruism.
As usual, I note once again that Phil Goetz, as on virtually every occasion when he describes me as “seeming” to possess some opinion, is attacking up the wrong straw tree.
As usual, I note once again that Eliezer merely denies my reasonable interpretations of his writing, without any specifics or any explanation.
This post by Eliezer assumes that rationalists want to evangelize non-rationalists, and that they want to join together to do “all the work that needs doing to fix up this world.” If Eliezer believes something different, he could explain why what he wrote sounds the way it sounds, instead of making yet another baseless snide comment about me. His practice of issuing long pronouncements and then labeling people as “getting it” or “not getting it” calls to mind a priest more than a scientist.
… or you could take a minute to think what he might mean. What I came up with in a few seconds is:
“Most people are altruistic to some extent. However, altruism is a tricky problem—most people are not particularly effective at it. Since altruism is common, many rationalists are altruistic, and they will want to do better. This will take some effort.”
How did I do, EY?