figure out how to make everyone sitting around on a higher level credibly precommit to not messing with the power plug
That’s MFAI’s job. Besides, living on the “highest level” has the same problem: you have to protect your region of the universe from anything that could “de-optimize” it, and FAI will (attempt to) make sure this doesn’t happen.
If believing you inhabit the highest level floats your boat, be my guest; just don’t mess with the power plug on my experience machine.
From an instrumental viewpoint, I hope you plan to figure out how to make everyone sitting around on a higher level credibly precommit to not messing with the power plug on your experience machine, otherwise it probably won’t last very long. (Other than that, I see no problems with us not sharing some terminal values.)
I just have to ensure that the inequality (amount of damage I cause if outside my experience machine) > (cost of running my experience machine) holds.
Translating that back into English, I get “unplug me from the Matrix and I’ll do my best to help Skynet kill you all”.
Also that killing you outright isn’t optimal for them.
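A rough way to write down the condition being haggled over in the last few comments (the labels C, D, and K are mine, not notation anyone in the thread used): let C be the outsiders’ cost of keeping the experience machine powered, D the expected damage the occupant does if unplugged and left loose, and K the expected cost of killing the occupant outright. The deterrence scheme only holds up if running the machine is the outsiders’ cheapest option:

% C, D, K are illustrative labels introduced for this sketch, not the commenters’ notation
\[
  C < D \qquad \text{and} \qquad C < K
\]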
I can’t do much about scenarios in which it is optimal to kill humans. We’re probably all screwed in such a case. “Kill some humans according to these criteria” is a much smaller target than vast swathes of futures that simply kill us all.