The Coming of Age sequence examines Eliezer’s realization of this error from his own standpoint, and has further links.
In which post? I’m not finding discussion about the supposed danger of improved humanish AGI.
That Tiny Note of Discord, say. (Not on “humanish” AGI, but eventually exploding AGI.)
I don’t see much of a relation at all to what I’ve been discussing in that first post.
[http://lesswrong.com/lw/lq/fake_utility_functions/] is a little closer, but still doesn’t deal with human-ish AGI.