This is how Wallace defines it (he in turn defines macroscopically indistinguishable in terms of providing the same rewards). It’s his term in the axiomatic system he uses to get decision theory to work. There’s not much to argue about here?
His definition leads to a contradiction with the informal intuition that motivates considering macroscopic indistinguishability in the first place.
We should care about low-measure instances in proportion to the measure, just as in classical decision theory we care about low-probability instances in proportion to the probability.
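To make that concrete, here is a toy expected-utility sketch (in Python, with invented weights and utilities, not anything taken from Wallace's formalism): the branch weights enter the calculation exactly where probabilities would in classical decision theory.

```python
# Toy example: two actions, each leading to branches of (weight, utility).
# All numbers are invented for illustration.
outcomes = {
    "take_bet": [(0.9, +10), (0.1, -100)],  # high-weight win, low-weight big loss
    "decline":  [(1.0, 0)],
}

def expected_utility(branches):
    # Weight each branch's utility by its measure, exactly as classical
    # decision theory weights each outcome's utility by its probability.
    return sum(weight * utility for weight, utility in branches)

for action, branches in outcomes.items():
    print(action, expected_utility(branches))
# take_bet: 0.9*10 + 0.1*(-100) = -1.0, so "decline" (0.0) is preferred,
# even though the losing branch is low-measure.
```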
Why? Wallace’s argument is just “you don’t care about some irrelevant microscopic differences, so let me write down an assumption that is superficially related to that preference, and look, it implies the Born rule”. Given MWI, there is nothing physically or rationally wrong with valuing your instances equally whatever their measure is. Their thoughts and experiences don’t depend on measure, just as they don’t depend on the thickness or mass of the computer implementing them. You can rationally not care about irrelevant microscopic differences and still care about the number of your thin instances.
I’m not at all saying the experiences of a person in a low-weight world are less valuable than those of a person in a high-weight world. I’m just saying that when you are considering possible futures in a decision-theoretic framework you need to apply the weights (because weight is equivalent to probability).
Wallace’s useful achievement in this context is to show that there exists a set of axioms that makes this work, and this includes branch-indifference.
This is useful because it makes clear the way in which the branch-counting approach you’re suggesting is in conflict with decision theory. So I don’t disagree that you can care about the number of your thin instances, but what I’m saying is that in that case you need to accept that this makes decision theory and probably consequentialist ethics impossible in your framework.
It doesn’t matter whether you call your multiplier “probability” or “value” if it results in your decision not to care about low-measure branches. The only difference is that probability is supposed to be about knowledge, and since Wallace’s argument involves an arbitrary assumption, not only physics, the multiplier is not probability but value, and there is no reason to value knowledge of your low-measure instances less.
this makes decision theory and probably consequentialist ethics impossible in your framework
It doesn’t? Nothing stops you from making decisions in a world where you are constantly splitting. You can try to maximize splits of good experiences or something. They just wouldn’t be the same decisions you would make without knowledge of splits, but why shouldn’t new physical knowledge change your decisions?
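To make the disagreement concrete, here is a toy sketch (again with invented numbers, not a worked-out proposal) of how a branch-counting rule can diverge from measure-weighting: an action that produces many low-measure good branches wins under counting and loses under the Born weights.

```python
# Toy contrast between a measure-weighted rule and a branch-counting rule.
# Branches are (measure, utility) pairs; all numbers are invented.
actions = {
    # one high-measure good branch, one low-measure bad branch
    "A": [(0.99, +1), (0.01, -1)],
    # a hundred low-measure good branches, one high-measure bad branch
    "B": [(0.001, +1)] * 100 + [(0.9, -1)],
}

def born_weighted(branches):
    return sum(m * u for m, u in branches)

def branch_counting(branches):
    # Ignore measure entirely: every branch counts the same.
    return sum(u for _, u in branches)

for name, branches in actions.items():
    print(name, born_weighted(branches), branch_counting(branches))
# Born-weighted: A = 0.98 beats B = -0.8.
# Branch-counting: B = 99 beats A = 0.
```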
OK, ‘impossible’ is too strong; I should have said ‘extremely difficult’. That was my point in footnote 3 of the post. Most people would take the fact that it has implications like needing to “maximize splits of good experiences” (I assume you mean maximise the number of splits) as a reductio ad absurdum, because it is massively different from our normal intuitions about what we should do. But some people have tried to take that approach, like in the article I mentioned in the footnote. If you or someone else can come up with a consistent and convincing decision approach that involves branch counting, I would genuinely love to see it!
And counted branches.