But if it comes down to Us or Them, I’m with Them. You have been warned.
That’s from the document where Yudkowsky described his “transfer of allegiance”.
What puzzles me is how the outfit gets any support. I mean, they are a secretive, closed-source machine intelligence outfit that makes no secret of its plan to take over the world. To me, that is like writing BAD GUY in big, black letters on your forehead.
The “He-he—let’s construct machine intelligence in our basement” is like something out of Tintin.
Maybe the way to understand the phenomenon is as a personality cult.
That’s how it strikes me also. To me Yudkowsky has most of the traits of a megalomaniacal supervillain, but I don’t hold that against him. I will give LessWrong this much credit: they still allow me to post here, unlike Anissimov who simply banned me outright from his blog.
I’m pretty sure Eliezer is consciously riffing on some elements of the megalomaniacal supervillain archetype; at the very least, he name-checks the archetype here and here in somewhat favorable terms. There are any number of reasons why he might be doing so, ranging from pretty clever memetic engineering to simply thinking it’s fun or cool. As you might be implying, though, that doesn’t make him megalomaniacal or a supervillain; we live in a world where bad guys aren’t easily identified by waxed mustaches and expansive mannerisms.
Good thing, too; I lost my goatee less than a year ago.
I expect it helps to have your own content come up first when people search for your name and the word “supervillain”. Currently 3 of the top 4 posts for those search terms are E.Y. posts.
What. That quote seems to be directly at odds with the entire idea of “Friendly AI”. And of course it is, as a later version of Eliezer refuted it:
I’m also not sure it makes sense to call SIAI a “closed-source” machine intelligence outfit, given that I’m pretty sure there’s no code yet.
WTF? It says right at the top of the page:
Since the quote is obsolete, as nhamann pointed out and as it says right at the top of the page, maybe you are being struck wrong.