Do you really mean to indicate that not running everything is equivalent to extinction?

Pretty much, yes. Total loss of power and value amounts to slow, delayed extinction. It’s certainly cultural extinction.
Note that I forgot to say that I put some weight/comfort in thinking there are some parts of mindspace which an AI could include that are nearly as good as (or maybe better than) biologicals. Once everyone I know and everyone THEY know are dead, and anything I recognize as a virtue has mutated beyond my recognition, it’s not clear what preferences I would have about the ongoing civilization. Maybe extinction is an acceptable outcome.
What does “value” mean here? I seriously don’t know what you mean by “total loss of value”. Is this tied to your use of “economically important”?
I personally don’t give a damn about anybody else depending on me as the source of anything they value, at least not with respect to anything that’s traditionally spoken of as “economic”. In fact I would prefer that they could get whatever they wanted without involving me, and I could get whatever I wanted without involving them.
And power over what? Most people right this minute have no significant power over the wide-scale course of anything.
I thought “extinction”, whether for a species or a culture, had a pretty clear meaning: It doesn’t exist any more. I can’t see how that’s connected to anything you’re talking about.
I do agree with you about human extinction not necessarily being the end of the world, depending on how it happens and what comes afterwards… but I can’t see how loss of control, or value, or whatever, is connected to anything that fits the word “extinction”. Not physical, not cultural, not any kind.
“value” means “net positive to the beings making decisions that impact me”. Humans claim to and behave as if they care about other humans, even when those other humans are distant statistical entities, not personally-known.
The replacement consciousnesses will almost certainly not feel the same way about “legacy beings”, and to the extent they preserve some humans, it won’t be because they care about them as people, it’ll be for more pragmatic purposes. And this is a very fragile thing, unlikely to last more than a few thousand years.
In fact I would prefer that they could get whatever they wanted without involving me, and I could get whatever I wanted without involving them.
Sure, but they can’t, and you can’t. They can only get what other humans give/trade/allow them, and you are in the same boat. “Whatever you want” includes limited exclusive-use resources, and if it’s more valuable (overall, for the utility functions of whatever’s making the decisions) to eliminate you than to share those resources, you’ll be eliminated.