Before I get going, please let me make clear that I do not
understand the math here (even Eliezer’s intuitive Bayesian paper
defeated me on the first pass, and I haven’t yet had the courage to
take a second pass), so if I’m Missing The Point(tm), please tell
me.
It seems to me that what’s missing is a discussion of the probability
of a given level of resourcefulness on the mugger’s part. Let me ’splain.
If I ask the mugger for more detail, there are a wide variety of
variables that determine how resourceful the mugger claims to be.
The mugger could, upon further questioning, reveal that all
the death events are the same entity being killed in the same way,
which I count as one death; given the unlikelihood of the mugger telling
the truth in the first place, I’d not pay. Similarly, the mugger
could reveal that the deaths, while of distinct entities, happen one
at a time, and may even include time for the entities to grow up and
become functioning adults (i.e. one death every 18 years), in which
case I can almost certainly put the money to better use by giving it
to SIAI.
At the other end of the scale, the mugger can claim infinite
resources, so that they can complete the deaths (of entirely distinct
entities, which have lives, grow up, and are then slaughtered) in an
infinitely small amount of time. If the mugger does so, they don’t
get the money, because I assign an infinitesimally small probability
to the mugger having infinite resources. Yes, the
mugger may live in a magical universe where having infinite
resources is easy, but you don’t get a
get-out-of-probability-assignment-free card just by saying the word
“magic”; I still have to base my probability assignment of your
claims on the world around me, in which we don’t yet have the
computing power to simulate even one human in real time (ignoring
the software problem entirely).
Between these two extremes is an entire range of possibilities. The
important part here is that the probability I assign to “the mugger
is lying” is going to increase exponentially as their claimed
resources increase. Until the claimed rate of birth, growth, and
death exceeds the rate of deaths we already have here on Earth, I
don’t care, because I can better spend the money here. After we
reach that point (~150K deaths per day), I still don’t care, because my
probability is something like 1/O(2^n) (computer-science big-O
there; sorry, that’s my background), where n is the multiple of
computing resources claimed over “one mind in realtime”. At 150K
deaths per day, that’s about 55 million deaths per year, and with 18
realtime years of life per person the mugger needs roughly a billion
minds running at once, so n is somewhere around 985,500,000. That’s
not even counting the probability discount due to the ridiculousness
of the whole claim.
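
For the back-of-the-envelope crowd, here’s a quick sketch of that arithmetic; the 150K/day and 18-year figures are just my rough assumptions, not real demographic data:

```python
# Rough sketch of the n calculation above.  The death rate and lifespan
# are my ballpark assumptions, nothing more.
import math

deaths_per_day = 150_000        # roughly Earth's current death rate
years_per_life = 18             # each victim grows to adulthood first

deaths_per_year = deaths_per_day * 365
concurrent_minds = deaths_per_year * years_per_life  # minds alive at once

n = concurrent_minds            # multiple of "one mind in realtime"
log10_credence = -n * math.log10(2)   # my ~2^-n credence, in log10 terms

print(f"deaths per year:  {deaths_per_year:,}")      # ~55 million
print(f"concurrent minds: {concurrent_minds:,}")     # ~985 million
print(f"my credence:      about 10^{log10_credence:,.0f}")
```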
The point here is that I don’t care about the 3^^^^3 number; I only
care about the claimed deaths per unit time, how that compares to
the number of people currently dying on Earth (on whom I know I
can well spend the $5), and the claimed resourcefulness of the
mugger. By the time we get up to where the 3^^^^3 number matters,
i.e. “I can kill one one-millionth of 3^^^^3 people every realtime
year”, my probability assignment for their claimed resourcefulness
is so incredibly low (and so far below the numbers they are throwing
at me) that I laugh and walk away.
There is not, as far as I can tell, a sweet spot where the number of
lives I might save by giving the mugger the $5 exceeds the number of
people currently dying on Earth by enough to offset the
ridiculously low probability I’d be assigning to the mugger’s
resourcefulness. I’d rather give the $5 to SIAI.
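
For the curious, here’s a toy scan showing why no sweet spot turns up under that 2^-n prior; the expected-value framing and the 1e30 stand-in for 3^^^^3-scale claims are just my illustration:

```python
# Toy sweet-spot scan: expected lives saved = (claimed lives) * 2^-n,
# where n is the claimed multiple of "one mind in realtime" (i.e. the
# number of concurrent minds).  All figures here are my assumptions.
import math

EARTH = 150_000 * 365 * 18   # ~985 million: one Earth's worth of minds

def log10_expected_lives(claimed_minds: float) -> float:
    """log10 of claimed_minds * 2^-claimed_minds, my expected payoff."""
    return math.log10(claimed_minds) - claimed_minds * math.log10(2)

claims = [("1x Earth", EARTH),
          ("1000x Earth", 1_000 * EARTH),
          ("3^^^^3-scale", 1e30)]  # stand-in; the real number won't fit anywhere

for label, minds in claims:
    print(f"{label:>13}: log10(expected lives saved) = "
          f"{log10_expected_lives(minds):.3e}")
```

Even at the one-Earth baseline the expected payoff is effectively zero, and it only gets worse as the claims grow.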
-Robin