Not at all. I wouldn’t trade any secular value for Frank’s life, but if I were offered a deal under which Frank might die (or live) with probability 1/3^^^3, I’d be more curious about how on earth even Omega could achieve that level of precision than actually worried about Frank.
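(For scale: 3^^^3 is Knuth’s up-arrow notation, and expanding it even one step shows why no physical process could distinguish a probability of 1/3^^^3 from zero:)

```latex
3\uparrow\uparrow\uparrow 3
  = 3\uparrow\uparrow(3\uparrow\uparrow 3)
  = 3\uparrow\uparrow \left(3^{3^{3}}\right)
  = \underbrace{3^{3^{\cdot^{\cdot^{\cdot^{3}}}}}}_{7625597484987\ \text{threes}}
```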
> Not at all. I wouldn’t trade any secular value for Frank’s life
Eh? Do you mean you wouldn’t make the trade at any probability? That would be weird; everyone makes decisions every day that expose other people to small probabilities of harm.
Well of course. That’s why I put this in a white room.
(Also, just because I should choose something doesn’t mean I’m actually rational enough to choose it.)
Assuming I am perfectly rational (*cough* *cough*), the decision I’m actually making in the real world is “some fraction of myself living” versus “a small probability of someone else dying.”
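(If you want that comparison spelled out, here’s a minimal expected-value sketch. All the utilities and probabilities are made up purely for illustration; nothing here comes from the actual scenario:)

```python
# Toy expected-utility comparison, with made-up numbers for illustration:
# "some fraction of myself living" vs. "small probability of someone else dying."

U_MY_LIFE = 1.0       # utility of my own life (arbitrary units; an assumption)
U_FRANKS_LIFE = 1.0   # utility of Frank's life (same arbitrary units)

def ev_keep_fraction(fraction_of_me: float) -> float:
    """Expected utility of securing some fraction of my own survival."""
    return fraction_of_me * U_MY_LIFE

def ev_risk_to_frank(p_frank_dies: float) -> float:
    """Expected utility lost by exposing Frank to a small death probability."""
    return p_frank_dies * U_FRANKS_LIFE

# An everyday-sized tradeoff: ordinary activities like driving impose
# something on the order of a one-in-a-million death risk on others.
print(ev_keep_fraction(0.5) > ev_risk_to_frank(1e-6))  # True: take the trade
```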