I’m not as chill as all that, and I absolutely appreciate people worrying about those dimensions. But I do tend to act in day-to-day behavior (and believe, in the average sense: my probabilistic belief range includes a lot of scenarios, but the average and median are somewhat close together, which is probably a sign of improper heuristics) as if it’ll all be mostly-normal. I recently turned down a very good job offer in NYC (and happily, later found a better one in Seattle), but I see the analogy, and kind of agree it’s a good one, just from the other side: even people who think they’d hate NYC are probably wrong, because hedonic adaptation is amazingly strong. I’ll try to represent those you’re frustrated with.
There will absolutely be changes, many of which will be uncomfortable and probably a regression from my peak preference. As long as it’s not extinction or effective-extinction (a few humans kept in zoos or the like, but economically unimportant to the actual intelligent agents shaping the future), it’ll be … OK. Not necessarily great compared to imaginary utopias, but far better than the worst outcomes. Almost certainly better than any ancient person could have expected.
Do you really mean to indicate that not running everything is equivalent to extinction?

Pretty much, yes. Total loss of power and value is pretty much slow/delayed extinction. It’s certainly cultural extinction.
Note that I forgot to say that I put some weight/comfort in thinking there are some parts of mindspace which an AI could include that are nearly as good as (or maybe better than) biologicals. Once everyone I know and everyone THEY know are dead, and anything I recognize as virtue has mutated beyond my recognition, it’s not clear what preferences I would have about the ongoing civilization. Maybe extinction is an acceptable outcome.
What does “value” mean here? I seriously don’t know what you mean by “total loss of value”. Is this tied to your use of “economically important”?
I personally don’t give a damn about anybody else depending on me as the source of anything they value, at least not with respect to anything that’s traditionally spoken of as “economic”. In fact I would prefer that they could get whatever they wanted without involving me, and I could get whatever I wanted without involving them.
And power over what? Most people right this minute have no significant power over the wide-scale course of anything.
I thought “extinction”, whether for a species or a culture, had a pretty clear meaning: It doesn’t exist any more. I can’t see how that’s connected to anything you’re talking about.
I do agree with you about human extinction not necessarily being the end of the world, depending on how it happens and what comes afterwards… but I can’t see how loss of control, or value, or whatever, is connected to anything that fits the word “extinction”. Not physical, not cultural, not any kind.
“value” means “net positive to the beings making decisions that impact me”. Humans claim to care about other humans, and behave as if they do, even when those other humans are distant statistical entities rather than people they know personally.
The replacement consciousnesses will almost certainly not feel the same way about “legacy beings”, and to the extent they preserve some humans, it won’t be because they care about them as people, but for more pragmatic purposes. And this is a very fragile thing, unlikely to last more than a few thousand years.
“In fact I would prefer that they could get whatever they wanted without involving me, and I could get whatever I wanted without involving them.”
Sure, but they can’t, and you can’t. They can only get what other humans give/trade/allow them, and you are in the same boat. “Whatever you want” includes limited exclusive-use resources, and if it’s more valuable (overall, for the utility functions of whatever’s making the decisions) to eliminate you than to share those resources, you’ll be eliminated.