One reason I have respect for Eliezer is HPMOR—there’s a huge amount of fan fiction, and writing something which impresses both a lot of people who like fan fiction and a lot of people who don’t like fan fiction is no small achievement.
Also, it’s the only story I know of which gets away with such huge shifts in emotional tone. (This may be considered a request for recommendations of other comparable works.)
Furthermore, Eliezer has done a good bit to convince people to think clearly about what they’re doing, and sometimes even to make useful changes in their lives as a result.
I’m less sure that he’s right about FAI, but those two alone are enough to make for respect.
Eliezer has done a good bit to convince people to think clearly about what they’re doing
This is a source of disagreement. “Think clearly and change behavior” is not a good slogan; it is used by numerous groups. But (and the inferential distance is not clear from the beginning) there are lateral beliefs: computational epistemology, specificity, humans as imperfect machines, etc.
In a broad context, even education in general could fit this phrase, especially for people with no training in gathering data.
In the context of LessWrong and FAI, Yudkowsky’s fiction writing abilities are almost entirely irrelevant.