It does seem that the probability of someone being able to bring about the deaths of N people should scale as 1/N, or at least as 1/f(N) for some monotonically increasing function f. 3^^^^3 may be a more simply specified number than 1697, but it seems “intuitively obvious” (as much as that means anything) that it’s easier to kill 1697 people than 3^^^^3. Under this reasoning, the expected number of deaths caused by not giving the mugger $5 is something like N/f(N), which depends on what f is, but it seems likely that it converges to zero as N increases.
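To spell the arithmetic out (just a sketch, assuming the probability of the threat being genuine really does fall off as 1/f(N) for some particular f):

\[ \mathbb{E}[\text{deaths from refusing}] \;\approx\; N \cdot \frac{1}{f(N)} \;=\; \frac{N}{f(N)}, \qquad \frac{N}{f(N)} \to 0 \quad \text{whenever } \frac{f(N)}{N} \to \infty. \]

For f(N) = N the expected loss is a constant one life no matter how big the threat; only when f outpaces N (say f(N) = N log N, or f(N) = 2^N) does piling on more up-arrows actually work against the mugger.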
It is an awfully difficult question, though, because how do we know we don’t live in a world where 3^^^^3 people could die at any moment? It seems unlikely, but then so do a lot of things that are real.
Perhaps the problem lies in the idea that a Turing machine can create entities that have the moral status of humans. If there’s a machine out there that can create and destroy 3^^^^3 humans on a whim, then are human lives really worth that much? But, on the other hand, the laws of physics have already produced a number of humans within a couple of orders of magnitude of 3^^3, so what is one human life worth on that scale?
On another note, my girlfriend says that if someone tried this on her, she’d probably give them the $5 just for the laugh she got out of it. It would probably only work once, though.