First off, you have to assume that, as a prior, you’re just as likely to be anyone.
A meaningless question. If you can be anyone, then you are everyone at the same time, in the sense of controlling everyone's decisions (with some probability). Controlling one person's decisions and controlling another's are not mutually exclusive, so even though the prior probability of controlling any given person is low (that is, given what your decisions would be in some collection of situations, we can expect a low probability of them all determining that person's decisions), and the total number of people is high, the probabilities of controlling all available people don't have to add up to 1 (the sum could be far below 1, if you are in fact neither of these people, maybe you don't exist in this world; or far above 1, if all people are in fact your near-copies).

See:

http://lesswrong.com/lw/182/the_absentminded_driver/
http://lesswrong.com/lw/2os/controlling_constant_programs/
http://singinst.org/upload/TDT-v01o.pdf
This isn’t about your decision controlling them. It’s about the information gained by knowing you’re the nth person. The fact that you might not be a person doesn’t really matter, since you have to be a person for any of the possibilities mentioned.
the sum could be far below 1, if you are in fact neither of these people
P(U=m|T=n)∝1/n
That should be
P(U=m|T=n)∝1/(a+n)
where a = number of nonhuman sentients,
which approaches a constant as a increases without limit.
Oops. I've checked this several times and never noticed that.
Let this be a lesson. When quadrillions of lives are counting on it, make sure someone double-checks your math.
I’m going to consider the ramifications of this for a while. This argument might still apply significantly. It might not.
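To make the "approaches a constant" point concrete, here is a minimal numerical sketch. It is my own illustration rather than anything from the exchange, and the observer ranks n1, n2 and the values of a are made-up numbers chosen only for the demonstration.

```python
# Toy illustration of "approaches a constant as a increases without limit":
# the corrected term 1/(a + n) varies less and less with n as the number of
# nonhuman sentients a grows, so whatever update P(U=m | T=n) ∝ 1/(a + n)
# provides through its dependence on n gets washed out.

def term_ratio(n1: int, n2: int, a: int) -> float:
    """Ratio of 1/(a + n1) to 1/(a + n2); a value of 1.0 means no dependence on n."""
    return (a + n2) / (a + n1)

n1, n2 = 10**9, 10**12  # two hypothetical observer ranks
for a in [0, 10**10, 10**14, 10**18]:
    print(f"a = {a:g}: ratio = {term_ratio(n1, n2, a):.6f}")

# a = 0     -> ratio ≈ 1000      (the term depends strongly on n)
# a = 1e18  -> ratio ≈ 1.000001  (the term is effectively constant in n)
```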