An earlier version of my analysis (the previous blog post) looked at the case of finite n and found, as you suggest, that the possibility of running out of people to kidnap is an important consideration. You can choose the number of batches n to be so large that it is virtually certain a priori that the madman will eventually murder:
P(eventually murders) = 1 - epsilon for some small epsilon
However, it turns out that conditioning on the fact that you are kidnapped changes the probability dramatically:
P(eventually murders | you are kidnapped) = about (10/9) * (1/36) ≈ 0.031
The reason is that each batch is about ten times the size of the one before it, so the final batch contains roughly 9 times as many people as all earlier batches combined. The fact that you are kidnapped is therefore strong evidence that the madman is on his last batch of potential victims.
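To make the finite-n claim concrete, here is a minimal sketch of the Bayesian computation. It assumes the standard setup (each batch ten times the size of the last, a 1/36 chance of murder per batch, and "you" drawn uniformly from the pool of potential victims); the function name and parameters are mine, introduced for illustration:

```python
from fractions import Fraction

def p_murder_given_kidnapped(n, growth=10, p_kill=Fraction(1, 36)):
    """Exact P(eventually murders | you are kidnapped) for n batches.

    Batch i (i = 1..n) has growth**(i-1) people.  The madman reaches
    batch i only if the first i-1 rolls all failed; given he reaches
    batch i, he eventually murders iff one of the remaining n-i+1
    rolls succeeds.
    """
    p_live = 1 - p_kill                  # chance a single roll spares the batch
    joint = Fraction(0)                  # ∝ P(murders AND you are kidnapped)
    kidnapped = Fraction(0)              # ∝ P(you are kidnapped)
    for i in range(1, n + 1):
        size = Fraction(growth) ** (i - 1)       # people in batch i
        reach = p_live ** (i - 1)                # madman reaches batch i
        murders = 1 - p_live ** (n - i + 1)      # some later roll succeeds
        kidnapped += size * reach
        joint += size * reach * murders
    return joint / kidnapped

# As n grows, this converges rapidly to (10/9) * (1/36) = 5/162:
print(float(p_murder_given_kidnapped(30)))   # ≈ 0.0309
```

With n = 1 the conditional probability is just 1/36, and by n = 30 it has already settled at about 0.0309, matching the (10/9)·(1/36) figure above.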