My real answer: we probably shouldn’t? Creating sentient life that has even slightly different morals seems like a very morally precarious thing to do without significant thought. (See the cheese post, can’t find it)
and you don’t get to program their DNA in advance?
Uh, why not?
Make humans that will predictably end up deciding not to colonize the galaxy or build superintelligences.
I guess I’m more comfortable with procreation than you are :)
I imposed the “you don’t get to program their DNA in advance” constraint since it seems plausible to me that if you want to create a new colony of actual humans, you don’t have sufficient degrees of freedom to make them actually human-like but also docile enough.
You could imagine a similar task of “build a rather powerful AI system that is transparent and able to be monitored”, where ongoing supervision may be required, but that’s not an onerous burden.