Surviving AGI, and AGI solving aging, doesn’t imply that you specifically get to live forever. In my mental model, even safe AGI entails mass disempowerment: how do you earn an income when everything you do is done better and cheaper by an AI? If the answer is UBI, what leverage do people have to argue for it and to keep it politically?
One common view is that once AI far surpasses human capabilities, most of the arguing about politics will itself be done by AI, which makes the values programmed into the AI matter enormously.
I count “you don’t get a feasibly obtainable source of the resources needed to survive” as an instance of the “death from AGI” category. Of course, death from AGI is not mutually exclusive with AGI having immortality tech.