Jiro’s response shows one good reason why I don’t find that thought experiment very interesting. Another obvious reason is its extreme implausibility and, I strongly suspect, actual incoherence (given what we know about physics and biology). I think I can safely say “I have no idea what I would prefer”, much like Eliezer finds no reason to answer how he would explain his arm being turned into a blue tentacle, and not have that be counted against me.
On to FAI theory:
Would a friendly AI lexically discount the welfare of “weaker” beings such as you and me (compared to this hyper-agent)? Could that possibly be an FAI?
By definition, it would not, because if it did, then it would be an Unfriendly AI.
If not, then I think we should also rethink our moral behaviour towards weaker beings in our game here, for our decisions can correspondingly result in bad things for them.
How do you get from facts about the behavior of an FAI to claims about how we should act? I spy one of those pesky “is-ought” transitions that bedeviled Hume!
Corollary: why should we care that our behavior results in bad things for animals? Isn’t that the question in the first place, and doesn’t your statement beg said question?