I would choose that knowledge if there were a chance that it wouldn’t find out about it. As far as I understand, your knowledge of the dangerous truth only increases the likelihood of suffering; it doesn’t make it guaranteed.
I don’t understand your reasoning here—bad events don’t get a “flawless victory” badness bonus for being guaranteed. A 100% chance of something bad isn’t much worse than a 90% chance.
I said that I wouldn’t want to know it if a bad outcome were guaranteed. But if knowing made a bad outcome possible, yet very, very unlikely to actually occur, then the utility I assign to knowing the truth would outweigh that remote possibility of something bad happening.
No, dude, you’re wrong