“Your decision theory should (almost always) be the same, whether you suppose that there is a 90% probability of something happening, or if it will happen in 9 out of 10 worlds. ”
I STRONGLY disagree here. If you suppose there is a 90% probability of something happening, this usually means you haven't updated your priors enough to recognize that it actually happens in approximately 100% of worlds; less frequently (but sadly, probably not nine times less frequently), it means you haven't updated enough to recognize that it almost never, or outright never, happens.
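A minimal sketch of that mixture intuition, in Python (the hypotheses, frequencies, and prior weights are invented for illustration): a stated 90% credence can decompose into a weight on "happens in essentially all worlds" and a weight on "essentially never happens", and a few observations should collapse it toward one extreme rather than leave it sitting at 90%.

```python
from fractions import Fraction

# Two sharp hypotheses (hypothetical numbers) about the per-world frequency:
freq = {"A": Fraction(99, 100),   # A: happens in ~100% of worlds
        "B": Fraction(1, 100)}    # B: almost never happens

# Prior weights chosen so the overall credence comes out near 90%.
prior = {"A": Fraction(908, 1000), "B": Fraction(92, 1000)}

def credence(p):
    # Overall probability of the event under the mixture p.
    return sum(p[h] * freq[h] for h in p)

def update(p, observed):
    # Bayesian update of the hypothesis weights on one observation.
    like = {h: (freq[h] if observed else 1 - freq[h]) for h in p}
    z = sum(p[h] * like[h] for h in p)
    return {h: p[h] * like[h] / z for h in p}

print(float(credence(prior)))      # ~0.90: the stated credence
post = prior
for _ in range(3):                 # observe the event a few times
    post = update(post, observed=True)
print(float(credence(post)))       # ~0.99: "happens in ~all worlds"
```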
“Average utilitarianism suddenly looks a lot more attractive—you don’t need to worry about creating as many people as possible, because there are already plenty of people exploring person-space.”
If many worlds meant infinitely many people, this claim would be quite plausible to me, but why should aggregation stop mattering just because there are bignum people?
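A toy comparison (invented numbers) of what bignum does and doesn't buy: under total-style aggregation, the marginal person contributes just as much at any finite population size; only an actually infinite population would make that contribution negligible.

```python
def total_utility(n_people, utility_each):
    # Total view: welfare sums across people.
    return n_people * utility_each

def average_utility(n_people, utility_each):
    # Average view: unchanged by adding an identical person.
    return utility_each

for n in (10, 10**6, 10**100):  # bignum-sized but still finite populations
    gain = total_utility(n + 1, 1) - total_utility(n, 1)
    print(f"n = {n}: an extra person still adds {gain} to the total")
```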