You’re right, and I won’t argue it. The idea of “not impossible” is one I have difficulty with, though. In my original post, replace [...] with [...], for lack of a better alternative. With anosognosia, that thing is “recognize left-arm paralysis”. The reason I didn’t stick with that is that I don’t know whether I have anosognosia or not, which is another layer of uncertainty.

Stripped down, though, this is what I’m saying: it seems I should be uncertain about things I know to be certain, and that seems dishonest. I understand the argument against infinite certainty, and that 0 And 1 Are Not Probabilities. Perhaps my difficulty is that, as EY suggests, people often say “I can’t be certain” simply to establish themselves as rational rather than actually assessing probability. Perhaps it’s simply that I dislike an infinitely uncertain universe. Of course, the universe isn’t interested in what I like. The map, as ever, is not the territory.
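(For reference, the core of the argument being pointed at is the log-odds view: a paraphrase, not a quote from the post. Probabilities strictly between 0 and 1 map to finite log-odds, while p = 1 or p = 0 would correspond to infinitely strong evidence.)

```latex
% Log-odds paraphrase of "0 And 1 Are Not Probabilities":
% finite evidence only ever gets you finite log-odds.
\[
  \operatorname{logodds}(p) = \log\frac{p}{1-p},
  \qquad
  \lim_{p \to 1^{-}} \log\frac{p}{1-p} = +\infty,
  \qquad
  \lim_{p \to 0^{+}} \log\frac{p}{1-p} = -\infty .
\]
```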
You should say that something is impossible, without intending that to mean zero probability, if you can safely antipredict that event. Antiprediction means thinking of an event as if it can’t happen. The intuitions you get from treating a sufficiently low-probability event as impossible are more accurate than the intuitions you get from treating it as still possible.
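A minimal sketch of why "rounding to impossible" is cheap (not from the original thread; the probabilities and payoff numbers below are made up for illustration):

```python
# Treating a sufficiently low-probability event as impossible barely changes
# any decision you'd compute -- one way to read "antiprediction" here.

def expected_value(p_event: float, payoff_if_event: float, payoff_otherwise: float) -> float:
    """Expected payoff of a plan given the probability of the rare event."""
    return p_event * payoff_if_event + (1.0 - p_event) * payoff_otherwise

p_real = 1e-12           # "not impossible", just absurdly unlikely (illustrative)
p_antipredicted = 0.0    # the same event, mentally rounded to "can't happen"

ev_real = expected_value(p_real, payoff_if_event=-1_000_000.0, payoff_otherwise=10.0)
ev_anti = expected_value(p_antipredicted, payoff_if_event=-1_000_000.0, payoff_otherwise=10.0)

print(ev_real)   # ~9.999999
print(ev_anti)   # 10.0
# The two expected values differ by about 1e-6, far too little to change which
# plan you'd pick, so reasoning "as if" p = 0 costs essentially nothing here.
```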
Antiprediction is a very interesting suggestion. Your aggressive reasoning in this thread has changed the way I think about a few things. Well done, and thanks!
Even if no examples of this were available, that absence is not the kind of evidence that would justify claiming something is impossible.