The future is inherently uncertain, so if I were to accept a decreased quality of life (no more driving) to better my chances of surviving to ASI, I'd better have a strong intuition that ASI is actually coming.
Yeah, I definitely hear ya. I have these feelings too. But at the same time, I think it’s in violation of Shut Up and Multiply.
My main worry about decreasing my quality of life by not driving is the AI alignment problem. I can imagine a counterfactual world where AI is created but is not aligned, in which case I would have taken heavy precautions and suffered a decreased quality of life for many years, only to die to a rogue agent anyway.
I hear ya here too. It’s one of the main places that affects the conclusion, I think. I expect a post-singularity year to have positive utility because it seems like a place where it’d make sense to adopt the opinion of the experts I quoted in the article. (And because it’s less depressing.)