The idea is that a model of the world that helps you succeed inside the box might naturally generalize to making consequentialist plans that depend on whether you’re in the box. This is closely analogous to human intellect: we evolved our reasoning capabilities because they helped us reproduce in hunter-gatherer groups, but since then we’ve used our brains for all sorts of things that evolution never anticipated. Once placed in the modern environment rather than hunter-gatherer groups, we use our brains to invent condoms and otherwise deviate from what evolution originally selected brains for.