Oh sure—I don’t mean to imply there’s no upside in this framing, or that I don’t see a downside in Eliezer’s.
However, whether you know of outs depends on what you see as an out. E.g. buying much more time to come up with a solution could be seen as an out by some people. It’s easy to imagine many bad plans to do that, with potentially hugely negative side-effects.
Some of those bad plans would look rational, conditional on an assumption that there was no other way to avoid losing the future. Of course making such an assumption is poor reasoning, but the trouble is that it happens implicitly: nobody needs to say to themselves "...and here I assume that no-one on earth has come up with, or will come up with, approaches I've missed", they only need to fail to ask themselves the right questions.
Conditional on being very clear on not knowing the outs, I think this framing may well be a good one for many people—but I’m serious about the mental exercise.