I was looking at this image in a post and it gave me some (loosely connected/ADD-type) thoughts.
In order:
1. The entities outside the box look pretty scary.
2. I think I would get over that quickly; they’re just different evolved body shapes. The humans could seem scary-looking from their pov too.
3. Wait… but why would the robots have those big spiky teeth? (Implicit question: what narratively coherent world could this depict?)
4. Do these forms have qualities associated with predator species, and is that why they feel scary? (Is this a predator-species-world?)
5. Most humans are also predators in a non-strict sense.
6. I don’t want to live in a world where there are only the final survivors of selection processes, who shrug indifferently when asked why we don’t revive all the beings who were killed in the process which created the final survivors. (Implicit: related to how a ‘predator-species-world’ from (4) could exist.)
7. There have been many occasions where I’ve noticed what feels like a more general version of that attitude in a type of current human, but I don’t know how to describe it.
(I mostly ignored the humans-are-in-a-box part.)
I don’t want to live in a world where there are only the final survivors of selection processes, who shrug indifferently when asked why we don’t revive all the beings who were killed in the process which created the final survivors.
If you could revive all the victims of the selection process that brought us to the current state, all the crusaders and monarchists and Vikings and Maoists and so, so many illiterate peasant farmers (on much too little land, because you’ve got hundreds of generations of them at once, mostly with ideas that make Putin look like Sonia Sotomayor), would you? They’d probably make quite the mess. Bringing them back would probably restart the selection process, and we probably wouldn’t be selected again. It just seems like a terrible idea to me.
Some clarifications:
I’m thinking of this in the context of a post-singularity future, where we wouldn’t need to worry about things like conflict or selection processes.
By ‘the ones who were killed in the process’, I was thinking about e.g. herbivorous animals that were killed by predator species[1], but you’re correct that it could include humans too. A lot of humans have been unjustly killed (by others or by nature) throughout history.
I think my endorsed morals are indifferent about the (dis)value of reviving abusive minds from the past, though moral-patient-me dislikes the idea on an intuitive level, and wishes for a better narrative ending than that.
(Also I upvoted your comment from negative)
I also notice some implied hard moral questions (What of current mean-hearted people? What about the potential for past mean-hearted people to have changed into good people? etc.)
[1] As a clear example of a kind of being who seems innocent of wrongdoing. I’m not ruling out other cases; e.g., plausibly inside the mind of the cat that I once witnessed killing a bunny, there could be total naivety about what was even being done.
Sort of relatedly, I basically view evolution as having favored the dominance of agents with defect-y decision-making, even though the equilibrium of ‘collaborating with each other to harness the free energy of the sun’ would have been so much better. (Maybe another reason that didn’t happen is that, in that case, there would have been less of a gradual buildup of harder and harder training environments.)
I’m curious why you seem to think we don’t need to worry about things like conflict or selection processes post-singularity.
Because a benevolent ASI would make everything okay.
(If worrying about those is something you’d find fun, you could choose to experience contexts where you still would, like complex game/fantasy worlds.)
To be more precise: extrapolated over time, for any undesired selection process or other problem of that kind, either the problem is large enough that it gets exacerbated over time until it eats everything (and then that’s just extinction, but slower), or it’s not large enough to win out, and aligned superintelligence(s) plus coordinated human action are enough to stamp it out in the long run, which means it won’t be an issue for almost all of the future.
A problem that’s just large enough that coordination can’t stamp it out, but not large enough to eat everything, would be a very fragile equilibrium, and I think that’s pretty unlikely.