As I said, I don’t think MWI leads to truly enormous numbers of copies; back-of-the-envelope calculations suggest the count should be “closer to” 3^^^3 or a googolplex than to 3^^^^3. So yes: I tried to indicate that this idea does NOT solve the dilemma on its own. However, even if 3^^^^3 is so big as to make 3^^^3 look tiny, the latter is still not negligible, and deserves at least a mention. If Eliezer had mentioned it and dismissed it, I would have no objection. But I think it is notable that he did not.
For instance: say that earthwormchuck163 is right and there are fewer than 3^^^^3 possible distinct intelligent beings before you start to duplicate. Say the actual number is (x^^^x)^y, and that due to MWI there are (x^^^x) copies of a regular human spawned per fortnight. Then MWI reduces Matrix Lord’s threat from (x^^^x)^y to (x^^^x)^(y-1). That doesn’t seem like a big change; but if you suppose that only one of those copies is decisive for this particular Matrix Lord threat, you’ve just changed the cost/benefit ratio from order-of-1 to order-of-1/(x^^^x), which is a big shift.
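To make the arithmetic concrete, here is a toy sketch of that shift. The values X and Y are arbitrary stand-ins I’ve picked for x^^^x and y (the real quantities are far too large to compute); only the ratio matters.

```python
# Toy stand-ins: X plays the role of x^^^x, Y the role of y.
# These are illustrative placeholders, not meaningful values.
X = 1000
Y = 5

threat_before = X ** Y        # (x^^^x)^y beings threatened, ignoring MWI
threat_after = X ** (Y - 1)   # (x^^^x)^(y-1) once the X-fold duplication is factored out

# The benefit side of the calculation shrinks by a factor of X while the
# cost side stays put, so an order-of-1 cost/benefit ratio becomes order-of-1/X.
shift = threat_after / threat_before
print(shift)  # 0.001, i.e. 1/X
```

The point of the sketch is just that dividing the stakes by a factor of (x^^^x), while your own costs stay fixed, moves the whole calculation by that same factor.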
I know there are a number of possible objections to that specific argument. For instance, it relies on a symmetry of intelligence: the things MWI duplicates are the same kind of thing being threatened. If Matrix Lord were offering 3^^^^3 paperclips to Clippy, the argument wouldn’t help Clippy figure out the clipperific thing to do. The intent is not to make a convincing argument, but simply to demonstrate that a factor on the order of x^^^x can in principle be significant, even when the threat is on the order of 3^^^^3.