I’d guess he wants to create FAI quickly because, among other things, ~150,000 people are dying each day. And secretly because there are people who would build and run a UFAI without regard for the consequences, and therefore sharing knowledge with them is a bad idea.
I think your guesses as to the rationalizations that would be offered are right on the mark.
Putting aside whether this is a rationalization for hidden other reasons, do you think this justification is a valid argument? Do you think it’s strong enough? If not, why not? And if so, why should it matter if there are other reasons too?
I think those reasons are transparent bullshit.
A question. What fraction of those ~150,000 people per day fall into the category of “people who would build and run a UFAI without regard for the consequences”?
Another question: At what stage of the process does sharing knowledge with these people become a good idea?
… why should it matter if there are other reasons too?
Tell me what those other reasons are, and maybe I can answer you.
Do you think this justification is wrong because you don’t think 1.5*10^5 deaths per day are a huge deal, or because you don’t think constructing an FAI in secret is the best way to stop them?
Both. Though actually, I didn’t say the justification was wrong. I said it was bullshit. It is offered only to distract oneself and to distract others.
Is it really possible that you don’t see this choice of justification as manipulative? Is it possible that being manipulated does not make you angry?
You’re discounting the reasoning showing that Eliezer’s behavior is consistent with him being a good guy and claiming that it is merely a distraction. You haven’t justified those statements—they are supposed to be “obvious”.
What do you think you know and how do you think you know it? You make statements about the real motivations of Eliezer Yudkowsky. Do you know how you have arrived at those beliefs?
You’re discounting the reasoning showing that Eliezer’s behavior is consistent with him being a good guy
I don’t recall seeing any such reasoning.
You make statements about the real motivations of Eliezer Yudkowsky.
Did I? Where? What I am pretty sure I have expressed is that I distrust all self-serving claims about real motivations. Nothing personal—I tend to mistrust all claims of benevolence from powerful individuals, whether they be religious leaders, politicians, or fiction writers. Since Eliezer fits all three categories, he gets some extra scrutiny.