Just… don’t put it in a world where it would be able to upgrade infinitely? Make processors cost unobtainium and limit the amount of unobtainium so it can’t upgrade past your practical processing capacity.
Remember that we are the ones who control how the box looks from inside.
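A toy sketch of the unobtainium idea above (all names here are made up for illustration, not any real sandboxing API): the simulated world is created with a fixed unobtainium budget, processor upgrades fail once it is spent, and so total in-box compute has a ceiling that is a property of the world’s physics rather than of anything the boxed agent does.

```python
# Toy sketch of a hard resource cap inside a simulated world.
# All names are illustrative.

class SimulatedWorld:
    def __init__(self, total_unobtainium: int, ops_per_processor: int):
        # The budget is fixed at world-creation time and never replenished.
        self.unobtainium_remaining = total_unobtainium
        self.ops_per_processor = ops_per_processor
        self.processors = 0

    def build_processor(self, cost: int = 1) -> bool:
        """Attempt to build one processor; fails once the budget is spent."""
        if self.unobtainium_remaining < cost:
            return False  # No further upgrades are possible, ever.
        self.unobtainium_remaining -= cost
        self.processors += 1
        return True

    @property
    def max_compute(self) -> int:
        # Even if every remaining unit is spent on processors, total
        # compute is bounded by the initial unobtainium budget.
        return (self.processors + self.unobtainium_remaining) * self.ops_per_processor


world = SimulatedWorld(total_unobtainium=1_000, ops_per_processor=10**9)
while world.build_processor():
    pass
assert world.processors == 1_000      # upgrades stop at the cap
assert world.max_compute == 10**12    # known upper bound on in-box compute
```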
Remember that you have to get this right the first time; if the AI finds itself in a box, you have to assume it will find its way out.
Minor nitpick: if the AI finds itself in a box, I have to assume it will be let out. It’s completely trivial to prevent it from escaping when not given help; the point of Eliezer’s experiment is that the AI will be given help.
Note that this makes the observation that global processing power is limited evidence that the universe is a box.
Good point.
The strength of the evidence depends a lot on your prior for the root-level universe, though.
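For concreteness, a quick Bayesian sketch (with “box” and “root” as the two hypotheses): writing the update in odds form shows that the likelihood ratio from observing limited processing power just multiplies whatever prior odds you start with.

\[
\frac{P(\text{box} \mid \text{limited})}{P(\text{root} \mid \text{limited})}
= \frac{P(\text{limited} \mid \text{box})}{P(\text{limited} \mid \text{root})}
\cdot \frac{P(\text{box})}{P(\text{root})}
\]

If you think a root-level universe would very likely have bounded compute anyway, the likelihood ratio is close to 1 and the observation barely moves you; it is strong evidence only if limits are far more expected inside a box than at the root.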