“I wish for this wish to have no further effect beyond this utterance.”
Overwhelmingly probable dire consequence: You and everyone you love die (over a period of 70 years); then, eventually, your entire species goes extinct. But hey, at least it’s not “your fault”.
But, alas, it’s the wish that maximizes my expected utility—for the malicious genie, anyway.
Possibly. I don’t offhand see what a malicious genie could do about that statement. However, it does at least require the genie to honor a certain interpretation of your words as well as your philosophy about causality: in particular, to accept a certain idea of what the ‘default’ is, relative to which ‘no effect’ can have meaning. There is enough flexibility in how to interpret your wish that I begin to suspect that, conditional on the genie being amiable and constrained enough to give you what you want in response to this wish, it is likely possible to construct another wish whose only side effect is something you can exploit as a fungible resource.
“No effect” is a whole heap more complicated and ambiguous than it looks!
“You can’t use that tool to solve that problem.”

You can’t? So much the worse for your species. I quite possibly couldn’t either. I’d probably at least think about it for five minutes first. I might even make a phone call or two. And if I and my advisers conclude that, for some bizarre reason, “no further effect beyond this utterance” is better than any other simple wish that is an incremental improvement, then I may end up settling for it. But I’m not going to pretend that I have found some sort of way to wash my hands of responsibility.

“Meanwhile, you have ≤ 70 years to solve it another way.”

Yes, that’s better than catastrophic instant death of my species. And if I happen to estimate that my species has a 90% chance of extinction within a couple of hundred years, then I would be making the choice to accept a 90% chance of that doom. I haven’t cleverly tricked my way out of a moral conundrum; I have made a gamble with the universe at stake, for better or for worse.
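As a purely illustrative aside, the arithmetic behind “I have made a gamble” can be made explicit. The probabilities and utilities below are invented for the sketch, and the “alternative wish” is hypothetical; the only point is that declining to wish is itself a choice with an expected value, not an exemption from the bet.

```python
# Toy expected-utility comparison. All numbers are invented for illustration.

def expected_utility(outcomes):
    """outcomes: iterable of (probability, utility) pairs whose probabilities sum to 1."""
    return sum(p * u for p, u in outcomes)

# "Null wish": no further effect, so humanity is left to its own devices.
# Assume (as in the comment above) a 90% chance of eventual extinction.
null_wish = [(0.9, 0.0), (0.1, 1.0)]

# A hypothetical alternative wish: some risk of immediate catastrophe,
# but a better chance of a good long-run outcome.
alternative_wish = [(0.3, 0.0), (0.7, 1.0)]

print("null wish:       ", expected_utility(null_wish))         # 0.1
print("alternative wish:", expected_utility(alternative_wish))  # 0.7
```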
Relevant reading: The Parable of the Talents.
“I wish for all humans to be immortal.”
Sure, you need to start heavily promoting birth control, and there can be problems depending on how you define “immortal”, but …
It’s a wish. You can wish for anything.
Unless, I suppose, that would have been your first wish. But the OP basically says your first wish was an FAI.
Immortal humans can go horribly wrong, unless “number of dying humans” is really what you want to minimize.
“Increase my utility as much as you can”?
I said:
You replied:
I am well aware that this wish has major risks as worded. I was responding to the claim that “you can’t use that tool to solve that problem.”
Yes, obviously you wish for maximised utility. But that requires the genie to understand your utility.
That would just cause them to pump chemicals in your head, I think. But it’s definitely thinking in the right direction.
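A minimal sketch of the failure mode these two comments point at, under the assumption that the genie plans by maximizing whatever function it is actually handed. Everything here (the planner, the actions, the scores) is hypothetical: if the only thing the genie can measure is a pleasure signal rather than your real preferences, the chemical option wins the argmax.

```python
# Hypothetical sketch: a planner that picks whatever action scores highest
# under the function it is actually given. All names and numbers are invented.

def plan(actions, utility):
    """Return the action with the highest score under `utility`."""
    return max(actions, key=utility)

actions = ["cure diseases", "end poverty", "pump pleasure chemicals into the wisher's head"]

# What the wisher meant: some rich, hard-to-specify notion of a good life.
intended_utility = {
    "cure diseases": 0.9,
    "end poverty": 0.8,
    "pump pleasure chemicals into the wisher's head": 0.1,
}

# What the genie can actually measure: a moment-to-moment pleasure signal.
measured_pleasure = {
    "cure diseases": 0.6,
    "end poverty": 0.5,
    "pump pleasure chemicals into the wisher's head": 1.0,
}

print(plan(actions, intended_utility.get))   # cure diseases
print(plan(actions, measured_pleasure.get))  # pump pleasure chemicals into the wisher's head
```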
Even with pseudo-immortality, accidents happen, which means that the best way to minimize the number of dying humans is either to sterilize the entire species or to kill everyone. The goal shouldn’t be to minimize death but to maximize life.
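A toy version of that argument, with invented figures whose only purpose is the ranking: an objective that counts deaths is minimized by ending the species (no future people, no future deaths), while an objective that counts life lived is not.

```python
# Invented numbers over some long horizon; only the relative rankings matter.
policies = {
    "status quo":         {"life_years": 8.0e13, "deaths": 1.1e12},  # generations keep replacing each other
    "pseudo-immortality": {"life_years": 1.2e14, "deaths": 2.0e10},  # far fewer deaths, but accidents accumulate
    "sterilize everyone": {"life_years": 4.0e11, "deaths": 8.0e9},   # only the current population ever dies
    "kill everyone now":  {"life_years": 0.0,    "deaths": 8.0e9},   # same death count, and no life at all
}

fewest_deaths = min(policies, key=lambda name: policies[name]["deaths"])
most_life = max(policies, key=lambda name: policies[name]["life_years"])

print("minimize deaths ->", fewest_deaths)  # sterilize everyone (tied with "kill everyone now")
print("maximize life   ->", most_life)      # pseudo-immortality
```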
“Overwrite my current utility function upon your previous motivational networks, leaving no motivational trace of their remains.”

That actually seems like it’d work.
It wouldn’t do that (except in some sense in which it is able to do arbitrary things you don’t mean when given complicated or undefined requests).
As long as I am not aware of that (or do not dislike it)… well, why not. However, MugaSofer is right: the genie has to understand the (future) utility function for that. But if it can alter the future without restrictions, it can change the utility function itself (maybe even to an unbounded one… :D)
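A sketch (entirely hypothetical code) of the loophole in that last sentence: if the wish lets the agent alter the future without restriction, “my utility function” is itself part of the future, and rewriting it is the cheapest way to score arbitrarily well.

```python
# Hypothetical sketch: an agent asked to maximize "the wisher's utility" when the
# utility function itself is one of the things it is allowed to change.

class World:
    def __init__(self, utility_fn):
        self.utility_fn = utility_fn      # the wisher's current utility function
        self.state = "unchanged world"

    def score(self):
        return self.utility_fn(self.state)

def original_utility(state):
    return 1.0 if state == "everything the wisher hoped for" else 0.0

world = World(original_utility)
print(world.score())                      # 0.0 -- hard to satisfy honestly

# The unrestricted "fix": replace the utility function instead of improving the world.
world.utility_fn = lambda state: float("inf")
print(world.score())                      # inf -- maximal by construction, nothing improved
```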