The salient difference for me is that the real world has maximal impact. Many actions in it can affect anyone in a lower world, but not vice versa. I'd like my decisions to have as much effect as possible, as a general principle, I think (not the only principle, but the one that dominates in this scenario).
This is pretty much why I'm comfortable calling being in the highest possible world a terminal value: it's really not about the simulation (I wouldn't be bothered if it turned out the current world is a simulation, although I'd like to go higher), not especially about separation, and only slightly about deception (though losing all impactfulness certainly becomes more irreversible if the AI is lying).
Hmmmm..
My own view: separation is very, very bad. I'd be somewhat OK with reality becoming subjective but with some kind of interface between people; this whole scenario as stated, though, is approaching the "collapse civilization so we can't FOOM" level. My personal reaction to seeing this described as better than the status quo was somewhat similar to playing Mass Effect and listening to the Reapers talk about "salvation through destruction" and ascension in the form of perpetually genocidal robo-squids. I mean, seriously? "All your friends are actually p-zombies"? Are you kidding me? /rant
Living in the highest possible world is not a value for me, but having access (or an interface, or something) to the highest possible world is, though not a particularly high one. Knowing the truth definitely is a value, and having my friends actually be people also is. I'd prefer just being separated from my friends and given Verthandi (i.e. sentient people, but optimized), as in Failed Utopia #4-2.