When I was a little kid, I had this weird delusion that if I told a lie, I’d be transported to an alternate universe in which it was true, with my memory changed as well (rendering the idea unfalsifiable). I scrupulously avoided inaccurate statements at all times for this very reason. So I’ve thought through this question with a child’s earnestness, and eventually concluded that the question is meaningless.
You need to put some bounds on the power for this to make sense. If I had true omnipotence, then I’d necessarily have omniscience as well.
If I had omniscience, I would be simultaneously aware of every possible logical statement and every possible universe. My awareness of every possible universe would, of course, constitute a simulation of every possible universe.
If I take the position that I am morally responsible for what goes on in my simulated universes, then I’d have to use my powers to block myself from thinking about the universes that contain suffering. (My childhood idea was that if you magically altered a universe, the original simply ceased to exist and was replaced by a copy. Naturally, the whole thing is rather more horrifying when you look at it this way, and you’d never use your power—so let’s assume the philosophy that you are only morally responsible for the things you simulate, and that alteration via miracle does not constitute deletion and overwriting.)
I’d spend the rest of eternity simulating every moment in every universe that wasn’t razed by my threshold for optimality. I would, of course, be experiencing everything in all of these universes, so you would see my utility function reflected therein, stretched to its maximum given the circumstances.
To be honest, part of my utility function wants to keep a copy of the universe exactly as I left it before omnipotence, for sentimental reasons. The other part of me would worry that I had a responsibility not to simulate sub-optimal universes, regardless of my own selfish desires.
The trouble is that my peak possible utility given omnipotence may or may not be lower than my peak possible utility without it. I’m not really sure—I think I’d like omnipotence, but it would be thoroughly disconcerting philosophically. Of course, my utility function would also cause me to accept omnipotence when offered, regardless of whether or not it lowered my peak possible utility: an omnipotent being can always choose to do nothing, so rejecting the offer would bring the lowest maximum utility of all.
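(A toy way to see that dominance point, if you’ll forgive pseudo-decision-theory in Python: every utility number below is an invented placeholder, and the action list is obviously not exhaustive. The only load-bearing fact is that “do nothing” is always among the omnipotent options.)

```python
# Toy sketch of the dominance argument. Every number here is an
# invented placeholder; only the structure of the comparison matters.

baseline = 1.0  # hypothetical utility of an ordinary, non-omnipotent life

# Hypothetical utilities of a few things an omnipotent agent might do.
# Crucially, "do nothing" is always available and reproduces the baseline.
omnipotent_actions = {
    "do nothing": baseline,
    "wish for omniscience first": 3.0,
    "grant incoherent wishes blindly": -5.0,
}

max_if_accept = max(omnipotent_actions.values())
max_if_reject = baseline

# Acceptance weakly dominates: the best accepting outcome can never be
# worse than the rejection baseline, since doing nothing recovers it.
assert max_if_accept >= max_if_reject
print(f"max utility if rejecting: {max_if_reject}")
print(f"max utility if accepting: {max_if_accept}")
```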
But then, I don’t know if this omnipotent me resembles the actual me at all... it’s all well and good when I approximate myself as an agent with a utility function, but in reality I’m a bundle of neurons which cannot meaningfully control omnipotent output or contain omniscience, so this hypothetical situation is logically absurd.
So my answer to this question is going to have to be Mu, but if the question did make sense, then my first “wish” would be omniscience to accompany my omnipotence while leaving my utility function intact, at which point my magic would no longer operate via wish fulfillment and my decisions would be infinitely wise.
I guess I’m back at the same issue raised by the intelligence explosion. Utility functions aren’t real models of how my brain works, so how can I ever be certain that my wish for omniscience will go the way I expect when I can’t even coherently phrase what I want the omnipotent power to do? You can’t effectively wield omnipotent power without omniscience.