Really, x-risk mitigation is for dessert, though. There are so many failures of our society that are lower-hanging fruit in terms of how rational you have to be to see them.
I’d say it is more “so you get to dessert”! It’s there as a backup behind “create an FAI and thereby cure death”.
I agree totally that FAI is the single most important problem, but it is hard to see that. As far as I know, Bostrom/Eliezer noticed it about 10-13 years ago, before which not one person in the world had seen it as a problem. If our world had at least solved the easier-to-see problems, then one would have some confidence that FAI would at least be considered.
For the deleted context of the parent, please refer to the grand-aunt comment. (Courtesy of a bizarre bug somewhere by which every comment and PM reply of mine was being posted twice, and, evidently, a fast response by Roko.)
ETA: And I agree with parent.