I think the shorter the timeline, the more specific your plan and actions need to be. For short timelines (< 10 years, up to maybe 40 with very high confidence) to radical singularity-like disruption, you aren’t talking about “optimizing”, but about “preparing for” or “reacting to” the likely scenarios.
It’s the milder disruptions, or the longer timelines for radical change, that are problematic here. What have you given up, in working to make the short-timeline scenario more pleasant/survivable, that you’ll be sad about if the world doesn’t end?
Whether to have kids, and how much energy to invest in them (including before you have them: earning money you don’t donate, and otherwise preparing your life) rather than in AIpocalypse preparedness, is probably the biggest single decision riding on this prediction.