How would we go about disincentivizing this drift towards undesirable social norms?
Perhaps it could be useful if we had some high-status members in the community, who would sometimes very visibly do something non-rational, non-effective, non-altruist, just because it is fun for them.
As an extreme thought experiment, imagine Eliezer Yudkowsky writing and publishing fan fiction. LOL
That made me chuckle. Or writing some of the funniest philosophical humour I’ve read.
I don’t understand the view that “rationalists” are emotionless and incapable of appreciating aesthetics. I haven’t seen much evidence to back this claim, only anecdotes. If anything, people who see reality more clearly can see more of its beauty. As Feynman put it, a scientist can see more beauty in the world than an artist, because the scientist can see the surface-level beauty as well as the beauty in the layers of abstraction all the way down to fundamental physics.
If someone consistently fails to achieve their instrumental goals by adhering too firmly to some rigid and unreasonable notion of “rationality”, then what they think rationality is must be wrong/incomplete.
Downvoted. Do you actually consider HPMOR non-rational and non-effective? It isn’t just fan fiction, it’s a tiny layer of fan fiction wrapped around the Sequences. Judging from the numerous comments in the Open Threads starting with “I’ve discovered LW through HPMOR”, I think we could argue that HPMOR was more effective than the Sequences themselves (at least with respect to the goal of creating more aspiring rationalists).
More generally, every single piece of fiction written by EY that I’ve read so far involves very rational characters doing very rational things, and that’s kind of the point. No one is saying that you shouldn’t write fiction in general, but I do say that you shouldn’t stop being rational while writing fiction. Or poetry. A rationalist poet should lean toward didactic poetry or the like (at least, that’s what I would do). I am probably biased against silly poetry in general, but I personally regard writing dumb verses the way I regard eating unhealthy cookies… do it if that’s what you need to have fun, but you shouldn’t be proud of it.
More generally, every single piece of fiction written by EY that I’ve read so far involves very rational characters doing very rational things, and that’s kind of the point. No one is saying that you shouldn’t write fiction in general, but I do say that you shouldn’t stop being rational while writing fiction.
This feels to me like a goalpost being moved.
Yes, Eliezer’s characters do smart things, but the likely reason is that he likes writing them that way, the audience enjoys reading that, and he has a comparative advantage doing that. (Kinda like Dick Francis writes about horse racing all the time.)
And I guess HPMOR really is “the Sequences for people who wouldn’t read the Sequences otherwise”. But was it also strategically necessary to write this or this? New audience, perhaps, but strongly diminishing returns.
The original objection was that rationalists and effective altruists feel like they are not allowed to do things that are not optimal (fully rational, or fully altruistic). Writing HPMOR could have been an optimal move for Eliezer, but the following stories probably were not. They are better explained by the hypothesis that Eliezer enjoys writing.