Yes. You included a lot of disclaimers and they seem to be sufficient.
According to my preferences there are already more humans around than desirable, at least until we have settled a few more galaxies. That emphasizes just how important the no-externalities clause was to my judgement. Even the externality of diluting the neg-entropy in the cosmic commons slightly further would make the creation a bad thing.
I don’t share the same preference intuitions as you regarding self-clone-torture. I consider copies to be part of the output. If they are identical copies having identical experiences, then they amount to little more than having a backup available. If some are getting tortured, then the overall output of the relevant computation really does suffer (in the ‘get slightly worse’ sense, although I suppose it is literal too).
Also, I would hesitate to torture copies of other people, on the grounds that there’s a conflict of interest and I can’t trust myself to reason honestly. I might feel differently after I’d been using my own fork-slaves for a while.
It’s OK. I (lightheartedly) reckon my clone army could take out your clone army if it became necessary to defend myselves. I/we’d then have to figure out how to put ‘ourselfs’ back together again without merge conflicts once the mobilization was no longer required. That sounds like a tricky task, but it could be fun.
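Lightheartedly extending the metaphor: here is a toy three-way merge in Python, where the flat-dict representation of a fork’s state and all the names are hypothetical, showing exactly where the conflicts would bite (any memory both forks changed differently).

```python
def three_way_merge(base, fork_a, fork_b):
    """Naive three-way merge of two diverged 'selves'.

    base: the shared state before the forks diverged.
    fork_a, fork_b: each fork's state after mobilization.
    Returns (merged, conflicts): merged takes whichever side
    changed a key; conflicts lists keys both sides changed
    to different values.
    """
    merged, conflicts = {}, []
    for key in base.keys() | fork_a.keys() | fork_b.keys():
        old = base.get(key)
        a, b = fork_a.get(key), fork_b.get(key)
        if a == b:          # both agree (or neither changed)
            merged[key] = a
        elif a == old:      # only fork_b changed this memory
            merged[key] = b
        elif b == old:      # only fork_a changed this memory
            merged[key] = a
        else:               # both changed it differently
            conflicts.append(key)
            merged[key] = (a, b)  # keep both; resolve by hand
    return merged, conflicts


base = {"skills": "baseline", "grudges": None}
fork_a = {"skills": "logistics", "grudges": None}
fork_b = {"skills": "artillery", "grudges": "clone #7"}
merged, conflicts = three_way_merge(base, fork_a, fork_b)
print(conflicts)  # ['skills'] -- both forks changed it differently
```

As with git, the unhappy path is the whole problem: the merge itself is trivial until both forks have diverged on the same key.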
> I don’t share the same preference intuitions as you regarding self-clone-torture. I consider copies to be part of the output.
I derive my intuitions from the analogy of a CPU-inefficient interpreted language. I don’t care about the 99% wasted cycles, except secondarily as a moderate inconvenience. I care about whether the job gets done.
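To make the analogy concrete, here is a minimal sketch (the function names and the workload are arbitrary): two computations that return the identical result, one of which burns almost all of its cycles on interpreter overhead.

```python
import timeit

N = 1_000_000

def interpreted_sum(n):
    # Pure-Python loop: most cycles go to interpreter dispatch
    # and boxed-integer overhead, not to the additions themselves.
    total = 0
    for i in range(n):
        total += i
    return total

def direct_sum(n):
    # Closed form for 0 + 1 + ... + (n - 1): the same job
    # with almost no wasted work.
    return n * (n - 1) // 2

# By the 'does the job get done' criterion these are equivalent.
assert interpreted_sum(N) == direct_sum(N)

slow = timeit.timeit(lambda: interpreted_sum(N), number=5)
fast = timeit.timeit(lambda: direct_sum(N), number=5)
print(f"loop: {slow:.3f}s   closed form: {fast:.6f}s")
```

On CPython the loop is orders of magnitude slower, yet the output, which is all I claim to care about here, is bit-for-bit the same.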