SIAI has money. Not a ton of it, but enough that they don’t have to sell shares. The AGI programmers would much, much, much sooner extrapolate only their values than accept a small, extremely transitory reward in exchange for their power. Of note is that this completely and entirely goes against all the stated ethics of SIAI. However, I realize that stated ethics don’t mean much when that much power is on the line, and it would be silly to assume the opposite.
That said, this stems from your misinterpretation of the CEV document. No one has ever interpreted it the way you did. If that’s what Eliezer actually meant, then of course everyone would be freaking out about it. I would be freaking out about it. And rightfully so; such a system would be incredibly unethical. For Eliezer to simply publicly announce that he was open to bribes (or blackmail) would be incredibly stupid. Do you believe that Eliezer would do something that incredibly stupid? If not, then you misinterpreted the text. Which doesn’t mean you can’t criticize SIAI for other reasons, or speculate about the ulterior motives of the AGI researchers, but it does mean you should acknowledge that you messed up. (The downvotes alone are pretty strong evidence in that regard.)
I will note that I’m very confused by your reaction, and thus admit a strong possibility that I misunderstand you, you misunderstand me, or we misunderstand each other, in which case I doubt the above two paragraphs will help much.
For Eliezer to simply publicly announce that he was open to bribes (or blackmail) would be incredibly stupid. Do you believe that Eliezer would do something that incredibly stupid?
Of course not. But if he offers access and potentially influence in exchange for money, he is simply doing what all politicians do. What pretty much everyone does.
Eliezer was quite clear that he would do nothing that violates his own moral standards.
He was also quite clear (though perhaps joking) that he didn’t even want to continue to listen to folks who don’t pay their fair share.
Do you believe that Eliezer would do something that incredibly stupid?
Ok, I already gave that question an implicit “No” answer. But I think it also deserves an implicit “Yes”. Let me ask you: Do you think Eliezer would ever say anything off-the-cuff which shows a lack of attention to appearances that verges on stupidity?
Eliezer was quite clear that he would do nothing that violates his own moral standards. He was also quite clear (though perhaps joking) that he didn’t even want to continue to listen to folks who don’t pay their fair share.
He was quite clear that he didn’t want to continue listening to people who thought that arguing about the specific output of CEV, at the object level, was a useful activity, and that he would listen to anyone who could make substantive intellectual contributions to the actual problems at hand, regardless of their donations or lack thereof (“It goes without saying that anyone wishing to point out a problem is welcome to do so. Likewise for talking about the technical side of Friendly AI.” — the part right after the last paragraph you quoted...). You are taking a mailing list moderation experiment and blowing it way out of proportion; he was essentially saying “In my experience, this activity is fun, easy, and useless, and it is therefore tempting to do it in place of actually helping; therefore, if you want to take up people’s time by doing that on SL4, my privately-operated discussion space that I don’t actually have to let you use at all if I don’t want to, then you have to agree to do something I do consider useful; if you disagree, then you can do it wherever the hell you want aside from SL4.” That’s it. Nothing there could remotely be interpreted as selling influence or even access. I’ve disputed aspects of SIAI’s PR, but I don’t even think a typical member of the public (with minimal background sufficient to understand the terms used) would read it that way.
Of course not. But if he offers access and potentially influence in exchange for money, he is simply doing what all politicians do. What pretty much everyone does.
At this point I’m sure I misunderstood you, such that any quibbles I have left are covered by other commenters. My apologies. Blame it on the oxycodone.
Do you think Eliezer would ever say anything off-the-cuff which shows a lack of attention to appearances that verges on stupidity?
OF COURSE NOT! Haven’t you read the Eliezer Yudkowsky Facts post and comments? Yeesh, newcomers these days...
wtf?
Whereas you seem to be postulating that in the absence of sums of money, the SIAI has something to sell. Nothing stupid about it. Merely desperate.