Furthermore, I agree with every essay I’ve ever read by Yvain, I use “believe whatever gwern believes” as a heuristic/algorithm for generating true beliefs, and don’t disagree with anything I’ve ever seen written by Vladimir Nesov, Kaj Sotala, Luke Muehlhauser, komponisto, or even Wei Dai;
Wow. One of these is not like the others! (Hint: all but one have karma > 10,000.)
In all seriousness, being placed in that group has to count as one of the greatest honors of my internet life.
So I suppose I can’t be totally objective when I sing the praises of this post. Nonetheless, it is a fact that I was planning to voice my agreement well before I reached the passage quoted above. So, let me confirm that I, too, “stand by” the Sequences (excepting various quibbles which are of scant relevance in this context).
I’ll go further and note that I am significantly less impressed than most of LW by Holden Karnofsky’s critique of SI, and suspect that the extraordinary affective glow being showered upon it is mostly the result of Holden’s affiliation with GiveWell. Of course, that affective glow is so luminous (the post is at, what, like 200 now?) that to say I’m less impressed than everyone else isn’t really to say much at all, and indeed I agree that Holden’s critique was constructive and thoughtful (certainly by the standards of “the outside world”, i.e. people who aren’t LW regulars or otherwise thoroughly “infected” by the memes here). I just don’t think it was particularly original -- similar points were made in the past by people like multifoliaterose and XiXiDu (not to mention Wei Dai, etc.) -- nor do I think it is particularly correct.
(To give one example, it’s pretty clear to me that “Tool AI” is Oracle AI for relevant purposes, and I don’t understand why this isn’t clear to Holden also. One of the key AI-relevant lessons from the Sequences is that an AI should be thought of as an efficient cross-domain optimization process, and that the danger is inherent in the notion of “efficient optimization” itself, rather than residing in any anthropomorphic “agency” properties that the AI may or may not have.)
By the way, for all that I may increasingly sound like a Yudkowsky/SI “cultist” (which may perhaps have contributed to my inclusion in the distinguished list referenced above!), I still have a very hard time thinking of myself that way. In fact, I still feel like something of an outsider, because I didn’t grow up on science fiction, was never on the SL4 mailing list, and indeed had never even heard of the “technological singularity” before I started reading Overcoming Bias sometime around 2006-07.
(Of course, given that Luke went from being a fundamentalist Christian to running the Singularity Institute in less time than I’ve been reading Yudkowsky, perhaps it’s time for me to finally admit that I too have joined the club.)
There are many others, as well, but a full list seemed like an extremely terrible idea.
Though I’d like a post that encouraged people to make such lists in the comments, so I could figure out who the people I like like.
You could create such a post.
Alternatively, you could PM the people you like and ask them whom they like.
Extreme nitpick: “PM the people you like” is not the converse of “create a post”.
You are entirely correct. Edited.
If you do PM the people you like and ask them whom they like, I’d like to know. The list of people you take seriously is highly correlated with the list of people I take seriously.