FAI’s “top goal” is whatever humans’ collective “top goal” turns out to be. It’s not at all clear that this necessarily includes the continued existence and welfare of humans.
Especially if you get picky about the definition of a human. It seems plausible that the humans of today turn into the uploads of tomorrow. I can envision a scenario in which there is continuity of consciousness, no one dies, most people are happy with the results, and there are no “humans” left by some definitions of the word.
Sure. You don’t even have to get absurdly picky; there are lots of scenarios like that in which there are no humans left by my definitions of the word, and I still consider that an improvement over the status quo.