I think this essentially leads to SIA. Since you’re adding utilities over different copies of you, it follows that you care more about universes in which there are more copies of you.
Of course, it’s slightly different from SIA, because SIA wants more copies of anyone, whether you or not. But if the proportion of individuals who are copies of you stays constant across hypotheses, the two are equivalent.
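To make that equivalence concrete, here’s a toy version of the argument (my notation, not a formalization from the essay): let P(H) be your prior on hypothesis H, N_obs(H) the number of observers H contains, and N_you(H) the number of copies of you it contains. Then

SIA: P(H | your evidence) ∝ P(H) · N_obs(H)
copy-counting: P(H | your evidence) ∝ P(H) · N_you(H)

If N_you(H) = c · N_obs(H) for some constant c that doesn’t depend on H, the c cancels when you normalize, so the two posteriors are identical.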
Elsewhere in my essay, I discuss a prudential argument (which I didn’t invent) for assuming there are lots of copies of you. Not sure if that’s the same as Armstrong’s proposal.
PSA essentially favors more copies of you per unit of spacetime / physics / computation / etc. -- as long as we understand “copy of you” to mean “an instance perceiving all the data you perceive right now” rather than just a copy of your body/brain in a different environment.
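In the same toy notation (again my own sketch, not a formalization from the essay), if R(H) is the total amount of spacetime / physics / computation that hypothesis H contains, PSA weights by the density of your copies rather than their raw count:

PSA: P(H | your evidence) ∝ P(H) · N_you(H) / R(H)

So unlike SIA, PSA doesn’t automatically favor bigger universes; it favors universes where a larger fraction of the available resources instantiate your current observations.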