Given your actual reasons for wondering about the world economy in 2040 conditioned on there not having been an extinction/Singularity yet, the survivalist option is actually worth a small hedge bet. If you can go (or convince someone else to go) live in a very remote area, with sufficient skills and resources to continue working quietly on building an FAI if there’s a non-existential global catastrophe, that looks like it has a strongly positive expectation (since in those circumstances, the number of competing AI attempts will probably be few if any).
Now considering the Slump scenarios in which civilization stagnates but survives: there doesn't seem to be much prospect of ending up with extra capital in that situation relative to others, but the capital you do acquire might go relatively further.
I have to say that the fact you're seriously considering these matters is a bit chilling. I'd be relieved if the reason were that you ascribed probability significantly greater than 1% to a Long Slump, but I suspect it's because you worry humanity will run out of time in many of the other scenarios before FAI work is finished, reducing you to looking at the Black Swan possibilities within which the world might just be saved.