To be more specific: suppose a stranger approached me with the following deal: "I am the creator of the Matrix. If you fall on your knees, praise me, and kiss my feet, I'll use my magic powers from outside the Matrix to run a Turing machine that simulates 3^^^^3 copies of you having their coherent extrapolated volition satisfied maximally for 3^^^^3 years." Why exactly should I penalize this offer in proportion to the number of copies being offered?
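To make the tension concrete, here is a minimal back-of-the-envelope sketch in Python. The numbers are made-up stand-ins (3^^^^3 itself cannot be represented, so we work with a placeholder for its logarithm, and the 10,000-bit description cost is an assumption): if the prior penalty depends only on the description length of the scenario rather than on the number of copies, the payoff term swamps the penalty.

```python
from math import log10

# Illustrative assumptions, not claims about the actual scenario:
# 3^^^^3 is far too large to represent, so we use a stand-in for
# (the log of) the number of copies promised.
log10_copies = 10.0 ** 100          # stand-in for log10(3^^^^3)

# A complexity-based prior penalizes the *description* of the
# hypothesis. Suppose the whole "Matrix trickster" story takes
# roughly 10,000 bits to specify:
description_bits = 10_000
log10_prior = -description_bits * log10(2)   # log10 of 2**-10000

# log10 of the expected payoff = log10(prior) + log10(copies)
log10_expected_utility = log10_prior + log10_copies
print(log10_expected_utility > 0)   # True: payoff dwarfs the penalty
```

Unless the penalty is made to scale with the number of copies (rather than with the complexity of the story), the expected value of kneeling comes out astronomically positive, which is exactly the mugging.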
There will be a penalty, and a large one, but it isn’t going to be directly dependent on the number of people. Compare the following:
A: There is an avatar from outside the Matrix who is able and possibly willing to simulate 3^^^^3 copies of you.
B: There is an avatar from outside the Matrix who is able and possibly willing to simulate BusyBeaver(3^^^^3) copies of you.
P(A) is greater than P(B), but it is not even remotely BusyBeaver(3^^^^3)/3^^^^3 times greater. Most of the improbability is in the ridiculous out-of-the-Matrix trickster, not in the extent of his trickery. If it is a lack of symmetry that is taken to be the loophole, it is easy enough to modify the scenario to make things symmetrical. Trickster: "I'll simulate this universe up to now a gazillion times, and after now the sims get lots of utility." Concern about how surprising it is for you to be the one with the power then becomes irrelevant.
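One way to make this precise, as a sketch assuming a Solomonoff-style prior that weights a hypothesis h by roughly 2^-K(h), with K the Kolmogorov complexity: writing N = 3^^^^3, the number BusyBeaver(N) has a short description (the expression just written), so K(BB(N)) exceeds K(N) by at most a constant c, the length of the program that maps a description of N to a description of BB(N). Then

$$\frac{P(A)}{P(B)} \approx 2^{K(B) - K(A)} \le 2^{c},$$

and the ratio is bounded by a constant that has nothing to do with the magnitude of BB(N)/N.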
Is the Pascal's mugging thought experiment a reductio ad absurdum of Bayes' theorem in combination with the expected utility formula and Solomonoff induction?
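For reference, a sketch of where the tension lies (assuming unbounded utilities and a Solomonoff-style prior; this is a standard convergence worry, not a theorem being claimed here): the expected utility of an action sums over hypotheses,

$$\mathbb{E}[U] = \sum_{h} 2^{-K(h)}\, U(h),$$

and for each n there are hypotheses of complexity at most n + c promising utilities on the order of BB(n). Since 2^-(n+c) BB(n) grows without bound, the terms do not even go to zero, so the sum cannot converge. Unbounded utility plus a complexity-based prior appears to break expected utility maximization before any mugger shows up.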
I consider expected utility maximisation to be a preference. A fairly sane-sounding preference, but still just as 'arbitrary' as whether you prefer to have mass orgasms or eat babies. (The 'expected' part, that is.)