Does the MWI make rationality irrelevant? Every choice gets made in some universe (because there’s at least one extremely improbable quantum event that arranges the particles in your brain to produce any given choice). Therefore, you will make the correct choice in at least one universe.
Of course, this leads to the problem of continuing conscious experience (or the lack of it), and whether you should care about what happens to you in all the possible future worlds in which you will exist.
That doesn’t just make rationality irrelevant, it makes everything irrelevant. Love doesn’t matter, because you won’t meet that special someone in every world and will meet them in at least one world regardless. Education doesn’t matter, because guessing will get you the right answer somewhere.
I want to be happy and right in as many worlds as possible. Rationality matters.
I want more copies of me to make the correct choice.
Cf. this thread, which is relevant here.
You’d want to make the correct choice in future worlds. What are the chances of you being in that one world where that happens?
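A rough back-of-the-envelope sketch of why those chances are so small (the numbers below are made up purely for illustration; the point is only that the Born measure of "a quantum fluke rearranged my brain into the right answer" branches is astronomically tiny compared to branches where you actually reasoned your way there):

```python
# Hypothetical illustration: suppose "guessing right by quantum accident"
# requires ~n_events independent, improbable micro-events in the brain,
# each with Born measure p_flip. Both numbers are assumptions, not physics.
from math import log10

p_flip = 1e-10    # assumed measure of one improbable micro-event
n_events = 1_000  # assumed number of such events needed to flip a decision

log_measure = n_events * log10(p_flip)  # log10 of the total branch measure
print(f"log10(measure of the 'lucky fluke' branches) ~ {log_measure:.0f}")
# ~ -10000: vanishingly small next to the measure of branches where you
# simply made the correct choice on purpose.
```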
This might be easier to consider as the simpler case of “given that we live in a deterministic universe, why does any choice I make matter?” I would say that I still have to make decisions about how to act, and choosing not to act is also a choice, so I should do whatever it is that I want to do.
http://wiki.lesswrong.com/wiki/Free_will
That’s not the same problem, though Egan’s Law is equally applicable to both. An agent might have no confusion over free will, have clear preferences and act normally on them in a single deterministic world, but not care about quantum measure and thus be a nihilist in many-worlds. (Actually, if such an agent seems to be in MW, it should by its preferences proceed under the Pascalian assumption that it lives in a single world and is being deceived.)
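A minimal sketch of that Pascalian point, with made-up numbers: if the agent ignores measure, then under many-worlds every action gives it the same payoff (everything happens either way), so any nonzero credence that it is really in a single, deceived world is what ends up driving its decisions.

```python
# Illustrative only; q and the payoffs are arbitrary assumptions.
q = 0.001            # assumed credence in "it's a single world after all"
U_MANY_WORLDS = 0.0  # same constant for every action, since measure is ignored

def expected_utility(u_single_world: float) -> float:
    """Expected utility of an action whose single-world payoff is u_single_world."""
    return q * u_single_world + (1 - q) * U_MANY_WORLDS

# The ranking of actions depends only on their single-world payoffs,
# for any q > 0 -- so the agent should act as if it lives in a single world.
print(expected_utility(10.0) > expected_utility(1.0))  # True
```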
Nick Bostrom has a couple of papers on this:
Infinite Ethics
Quantity of Experience: Brain-Duplication and Degrees of Consciousness
Could you explain that more? As far as I can see, an agent which doesn’t care about measure would engage in high-rate quantum suicide.