I think that is part of my point, but my main point was that many theories can receive this treatment.
For example, suppose you believe in a Big World such that every physically possible thing happens somewhere/somewhen in it.
And suppose you believe that there is a teapot in orbit between Mars and Jupiter.
Couldn’t you “prove” your belief by saying “what is probability anyway,” pointing out that there are infinitely many copies of you which live in solar systems with teapots between Mars and Jupiter, and saying that you value those copies more than everyone else? Not because you value any one person more than any other, of course—you value everybody equally—but because of the measure you assign over all copies of you in the Big World.
Do you think there is a principled difference between the scenario I just described, and what you are doing with Measureless Multiverse theory? If you say no, you aren’t sunk—after all, perhaps MMtheory is more plausible for other reasons than the Big World theory I described.
My answer is no, at least objectively. There is a small caveat here related to Eliezer's theory of metaethics. It is exactly the same as the way I say no, there is no principled reason why killing is bad. From my point of view, killing really is bad, and the fact that I think it is bad is not what causes it to be bad. Similarly, from my point of view simple things are more important, and if I were to change my mind about that, they would not stop being more important.
Okay. Well, this seems to me to be a bad mark against Measureless Multiverse theory.
If it can only be made to add up to normality by pulling a move that could equally well be used to make pretty much any arbitrary belief system add up to normality… then the fact that it adds up to normality is not something that counts in favor of the theory.
Perhaps you say, fair enough—there are plenty of other things which count in favor of the theory. But I worry. This move makes adding up to normality a cheap, plentiful feature that many, many theories share, and that seems dangerous.
Suppose our mathematical abilities advance to the point where we can take measures/languages and calculate the predictions they make, at least approximately. It might turn out that society is split on which simplicity prior to use, and thus split about which predictions to make in some big hypothetical experiment. (I'm imagining a big collider.) Under MMtheory, this would just be an ethical disagreement, one that in fact would not be resolved, or influenced in any way, by performing the experiment. The people who turned out to be "wrong" would simply say "Oh, so I guess I'm in a more complicated world after all. But this doesn't conflict with my predictions, since I didn't make any predictions."
What do you think about this issue? Do you think I made a mistake somewhere?
EDIT: Or was I massively unclear? Rereading, I think that might be the case. I’d be happy to rewrite if you like, but since I’m busy now I’ll just hope that it is comprehensible to you.