No value at all: that is my answer to “how valuable do you think creating extra identical, non-interacting copies of yourself is? (each copy existing in its own computational world, which is identical to yours with no copy-copy or world-world interaction)”
The existence of worlds that are not causally related to me should not influence my decisions (I learn from the past and I teach the future: my world cone is my responsibility). I decide by considering whether the world that I create, or allow my copy (or child) to exist in, is better off because of that copy (“better” according to me, my “better” being my best approximation to the “objective better”, if there is one). But here I do not even have a causal voice in the shape of the world in question; it is already postulated in a fixed way. Even more strongly, insofar as the copy is wasted computation in some other world, its value is negative. (If the results of that simulation are “consumed by someone”, it need not be “wasted”.)
Disclaimer: I haven’t followed the discussion too closely.