I don’t buy it. A camera that some robot is using to make decisions is no simpler than any other place on Earth, just more important.
(This already gives the importance-weighted predictor a benefit of ~log(quadrillion))
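(As a rough sanity check on that figure, assuming "a quadrillion" is meant as $10^{15}$ and measuring description length in bits:

$$\log_2 10^{15} = 15 \log_2 10 \approx 49.8 \text{ bits},$$

so the importance-weighted predictor would start with roughly a 50-bit advantage over one that treats the camera as just another location.)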
Clearly you need to e.g. make the anthropic update and do other things like that before you have any chance of competing with the consequentialist. This might just be a quantitative difference about how simple is simple—like I said elsewhere, all the action is in the additive constants; I agree that the important things are “simple” in some sense.
Ok, I see what you’re getting at now.