I am not that confident about this. Or like, I don’t know, I do notice my psychological relationship to “all the stars explode” and “earth explodes” is very different, and I am not good enough at morality to be confident about dismissing that difference.
There’s definitely some difference, but I still think that the mathematical argument is just pretty strong, and losing a factor of 10^23 of your resources for hosting life and fun and goodness seems to me extremely close to “losing everything”.
@habryka I think you’re making a claim about whether or not the difference matters (IMO it does) but I perceived @Kaj_Sotala to be making a claim about whether “an average reasonably smart person out in society” would see the difference as meaningful (IMO they would not).
(My guess is you interpreted “reasonable people” to mean like “people who are really into reasoning about the world and trying to figure out the truth” and Kaj interpreted reasonable people to mean like “an average person.” Kaj should feel free to correct me if I’m wrong.)
The details matter here! Sometimes when (MIRI?) people say “unaligned AIs might be a bit nice and may not literally kill everyone,” the modal story in their heads is something like: some brain states of humans are saved on a hard drive somewhere for trade with more competent aliens. And sometimes when other people say “unaligned AIs might be a bit nice and may not literally kill everyone,” the modal story in their heads is that some X% of humanity may or may not die in a violent coup, but the remaining humans get to live their normal lives on Earth (or even a solar system or two), with some AI surveillance, but their subjective quality of life might not even be much worse (and might actually be better).
From a longtermist perspective, or a “dignity of human civilization” perspective, maybe the stories are pretty similar. But I expect “the average person” to be much more alarmed by the first story than the second, and not necessarily for bad reasons.
I don’t want to speak for Ryan or Paul, but at least tentatively this is my position: I think the difference, from a resource-management perspective, between keeping humans around physically vs. keeping saved copies of them is ~0 when you have the cosmic endowment to play with. So small idiosyncratic preferences that are significant enough to motivate saving human brain states should also be enough to motivate keeping humans physically around; especially if humans strongly express a preference for the latter (which I think they do).
IMO this is an utter loss scenario, to be clear.