I think Wei has a point: it is in principle possible to hold preferences and an epistemology such that, via the argument he links to, you are contradicting yourself.
For example, if you accept SIA and think you should be a utility maximizer, then you are committed to risking a 50% chance of killing someone to save $1, which many people would find highly counter-intuitive.
Er, how does that follow?