Are the copies running (in a different environment so that their state will diverge)? If so, I don’t understand why each copy beyond the original should have no value. Only if each copy runs within an identical environment (so there’s no new information in any additional copy given one) can I buy that there’s no value in additional copies (beyond redundancy, in case there’s some independent chance of destruction-from-outside).
Er… This axiomatic setup implies that all copies have extra value.
“No intrinsic value in the number of copies”—perhaps I misread that, then? I admit I didn’t think through the implications of the axioms along with you, since I felt the first axiom was questionable.
It means that I don’t derive extra utility from having specifically one, three or 77 copies. So I don’t say “hey, I have three copies, adding one more would be a tragedy! I don’t want to have four copies—four is an unlucky number.”
It doesn’t mean that I don’t derive extra utility from having many copies and all of them being happy.
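For concreteness, here is one hedged reading of that distinction (my own formalization; the notation is not from the original axioms). Write $U$ for total utility, $n$ for the number of copies, and $u_i$ for how well copy $i$ is doing. “No intrinsic value in the number of copies” then just says there is no standalone bonus or penalty attached to the count itself:

$$ U = g(u_1, \dots, u_n) \quad \text{rather than} \quad U = g(u_1, \dots, u_n) + v(n), $$

where a nontrivial $v$ would encode things like $v(4) < 0$ because four is unlucky. Nothing in this stops $g$, and hence $U$, from growing as more happy copies are added.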
Maybe you mean “(my) utility as a function of how many copies (of ‘me’) there are (all in happy-enough situations) is [strictly] monotone”. Otherwise I don’t follow. This “special numbers with intrinsic value” concept is cumbersome.
I don’t like it either, and it may not be needed. (And I don’t need the “strictly monotone”; that’s a conclusion of the axioms.) I’ll have to recast it all formally to check whether it’s needed.
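One illustrative way that formal recasting could go (the additivity below is my assumption for the sketch, not one of the stated axioms): if utility simply sums the welfare of the copies and every copy is happy, strict monotonicity drops out as a conclusion rather than an extra axiom:

$$ U(n) = \sum_{i=1}^{n} u_i, \qquad u_i > 0 \ \Rightarrow\ U(n+1) - U(n) = u_{n+1} > 0. $$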