@billswift: Emotion might drive every human action (or not). That’s beside the point. If an emotion drives you into a dead end, there’s something wrong with that emotion.
My point was that if someone tells you the truth and you don’t believe them, it’s not fair to say they’ve led you astray. Eliezer said he didn’t “emotionally believe” a truth he was told, even though he knew it was true. I’m not sure what that means, but it sounds like a problem with Eliezer, involving his emotions, not a problem with what he was told.
On average, if you eliminate twice as many hypotheses as I do from the same data, how much more data do I need than you to achieve the same results? Does it depend on how close we are to the theoretical maximum?
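One way to make the question precise, under an assumption I'm supplying (that the information extracted per observation is measured in bits, i.e. eliminating a fraction f of the remaining hypotheses yields -log2(1 - f) bits): if you eliminate fraction 2q per observation and I eliminate fraction q, the ratio of data I need is log(1 - 2q) / log(1 - q). The `data_ratio` function below is my own illustrative sketch, not anything from the thread:

```python
import math

def data_ratio(q):
    """Ratio of observations I need relative to you, assuming you
    eliminate a fraction 2q of remaining hypotheses per observation
    and I eliminate a fraction q, with information per observation
    taken to be -log2(1 - fraction eliminated)."""
    p = 2 * q  # you eliminate twice the fraction I do
    assert 0 < p < 1, "fractions must leave some hypotheses standing"
    # bits you gain per observation / bits I gain per observation
    return math.log(1 - p) / math.log(1 - q)

# When eliminations are small, the ratio is close to 2:
#   data_ratio(0.05) -> about 2.05
# As your elimination approaches the theoretical maximum (wiping out
# nearly all hypotheses each observation), the ratio grows without bound:
#   data_ratio(0.45) -> about 3.85
```

So on this reading the answer is: roughly twice as much data when eliminations are small, but yes, it depends on closeness to the maximum, and the gap widens sharply as your per-observation elimination approaches it.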