Thanks for the interesting comments.
You’re right, I didn’t discuss the possibility of an infinite number of branches, though as you suggest it leads to essentially the same conclusion I reach in the finite case: it causes problems for consequentialist ethics (Joe Carlsmith’s ‘Infinite Ethics’ is good on this). If by ‘normalize everything’ you mean considering only the quantum weights (which are finite as mathematical measures) and not the number of worlds, then that seems to ignore those problems rather than address them.
I hope it was clear that I was suggesting a third approach (the number of worlds is neither finite nor infinite, but indefinite), which I think addresses the ethical problems better: if there is no definite number of worlds, then we have a reason to set the number of worlds aside and focus on the weights.
This third approach is based on the idea that ‘worlds’ are macroscopic, emergent phenomena created through decoherence (Wallace’s book contains a full mathematical treatment of this). This supports both the claim that the number of worlds is indefinite (since it depends on ultimately arbitrary mappings of macroscopic to microscopic states) and the claim that worlds are created through quantum processes (since they are macroscopically indistinguishable before decoherence occurs). My point in the post was that these two claims in combination avoid the repugnant conclusion by focusing on the weights.
(And when it comes to the virtue-theoretic implications, I’ve again tried to follow a weight-based approach, and not make assumptions about whether worlds are created or revealed.)
Thanks for the Egan suggestion; yes, I love his work, though I need to read Quarantine more fully. It seems the most philosophically relevant bit might be the ending, which is of course the source of Egan’s Law (‘it all adds up to normality’). I also need to read his short story ‘Singleton’, which I gather is very relevant too.