One criticism: why bring up Republicans? I’m not even a Republican, and I sort of recoiled at that part.
Agreed. Also not a Republican (or American, for that matter), but that was a bit off-putting. To quote Eliezer himself:
In Artificial Intelligence, and particularly in the domain of nonmonotonic reasoning, there’s a standard problem: “All Quakers are pacifists. All Republicans are not pacifists. Nixon is a Quaker and a Republican. Is Nixon a pacifist?”
What on Earth was the point of choosing this as an example? To rouse the political emotions of the readers and distract them from the main question? To make Republicans feel unwelcome in courses on Artificial Intelligence and discourage them from entering the field?
Yeah, I was thinking about exactly the same quote. Is this what living in the Bay Area for too long does to people?
How about using an example of a Democrat who insists that logic is colonialistic and oppressive; Aumann’s agreement theorem is wrong because Aumann was a white male; and the AI should never consider itself smarter than an average human, because doing so would be sexist and racist (and obviously also islamophobic if the AI concludes that there are no gods). What arguments could Eliezer give to zir? For bonus points, consider that any part of the reply would be immediately taken out of context and shared on Twitter.
Okay, I’ll stop here.
For the record, otherwise this is a great article!
I think the only argument I could give is a facepalm.
Same here. But—returning to the topic of the article—anything written by humans is in the same reference class, therefore the outside view suggests that one should ignore all arguments made by humans, ever. And anything you might say in defense of your arguments is merely an inside view, therefore less trustworthy. I mean, the strawman example in my comment could also provide special arguments in support of their argument, which only shows that “special arguments” are better ignored.
At the end you are left with: “I believe my opinions are better than yours, because it’s obvious given my meta-opinions. And I believe my meta-opinions are better than yours, because it’s obvious given my meta-meta-opinions. And I believe my meta-meta-opinions are better than yours, because… uhm, this is getting too abstract now, but I simply believe that this is how it is.” Again, not very convincing from the outside view. :D