I’ve been arguing about this with a friend recently [well, a version of this: I don’t have any problem with an arbitrarily large number of people being created and killed, unless the manner of their death is unpleasant enough that the negative value I assign to it exceeds the positive value of life].
He says that he can believe the person we are talking to has Agent Smith powers, but thinks that the more Agent Smith promises, the less likely the promise is to be true, and that this likelihood falls off fast enough that the probability of Agent Smith having the power to create and kill [in an unpleasant manner] Y people, multiplied by Y, tends to zero as Y tends to infinity. So the net expectation tends to zero. I disagree: I believe that if you assign some probability X to the claim that the person you are talking to is genuinely from outside the Matrix [and that you are in the Matrix], then the probability that Agent Smith has the power to create and kill [in an unpleasant manner] Y people, multiplied by Y, tends to infinity as Y tends to infinity.
Now, I think we can break this down further to find the root cause of our disagreement [this doesn’t feel like a fundamental belief]: does anyone have any suggestions for how to go about doing this? We began to argue about entropy and the chance that Agent Smith has found a way [being from outside the Matrix, our physics doesn’t apply to him] to reverse it, but I think we went downhill from there.
Edit: It looks like I was assuming probability distributions for which lim (Y → infinity) of Y*P(Y) is well defined. This turns out to hold for monotone distributions or some similar class (thanks, shinoteki).
I think it’s still the case that a probability distribution that would lead to TraderJoe’s claim of P(Y)*Y tending to infinity as Y grows would be un-normalizable. You can of course have a distribution for which this limit is undefined, but that’s a different story.
Counterexample:
P(3^^^...3, with n “^”s) = 1/2^n
P(anything else) = 0
This is normalized: the geometric series 1/2 + 1/4 + 1/8 + … sums to 1.
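A quick way to check this numerically: the Python sketch below (mine, not part of shinoteki’s comment) substitutes double exponentials Y_n = 2^(2^n) for the up-arrow towers, which are far too large to compute, and confirms that the probabilities sum towards 1 while Y*P(Y) grows without bound.

```python
from fractions import Fraction

# Toy analogue of the counterexample above. The real support points are towers
# 3^^...^3 with n up-arrows, which are far too large to compute, so this sketch
# uses double exponentials Y_n = 2**(2**n) instead (my substitution); they grow
# fast enough to make the same point.

N = 15  # number of support points to inspect

total = Fraction(0)
for n in range(1, N + 1):
    p = Fraction(1, 2**n)   # P(Y_n) = 1/2^n, a geometric series
    y = 2**(2**n)           # Y_n = 2^(2^n), the n-th support point
    total += p
    # Y_n * P(Y_n) = 2^(2^n - n): unbounded, even though the probabilities sum to 1
    digits = len(str(y // 2**n))
    print(f"n={n:2d}  P(Y_n)={p}  Y_n*P(Y_n) has {digits} digits")

print("partial sum of probabilities:", total)  # approaches 1, so it normalizes
```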
You might have been thinking of the fact that if a probability distribution on the integers is monotone decreasing (i.e. if P(n) > P(m) then n < m), then P(n) must decrease faster than 1/n. However, a complexity-based distribution will not be monotone, because some big numbers are simple while most of them are complex.
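For completeness, here is a sketch (my paraphrase of the standard argument, not part of the original comment) of why a monotone decreasing distribution on the positive integers must decrease faster than 1/n:

```latex
% Assume P(1) >= P(2) >= ... and \sum_m P(m) = 1. Then n * P(n) -> 0.
\[
  n \, P(2n) \;\le\; \sum_{k=n+1}^{2n} P(k) \;\le\; \sum_{k > n} P(k)
  \;\xrightarrow[n \to \infty]{}\; 0,
\]
\[
  \text{so } 2n \, P(2n) \to 0; \quad
  (2n+1) \, P(2n+1) \;\le\; (2n+1) \, P(2n) \;=\; 2n \, P(2n) + P(2n) \to 0,
\]
\[
  \text{hence } m \, P(m) \to 0, \text{ i.e. } P(m) = o(1/m).
\]
```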