In one sense, basically everywhere there is a very-low- or very-high-probability belief, since obviously I can’t be more confident in any belief than I am in the reliability of my system of reasoning. I definitely take this into account when I’m evaluating the proper strength of nearly-certain beliefs. In another sense, almost nowhere.
I don’t know exactly how confident I should be in my own sanity, except that the probability of insanity is small. Also, I’m not confident there would be any evidence distinguishing ‘sane and rational’ from ‘insane but apparently rational’. I model a logically insane VAuroch as being like the anti-inductors: someone following different rules which, by their own standards, are self-consistent.
Since I can’t determine how to quantify it, my response has been to treat all other beliefs as conditioned on “my reasoning process is basically sound”, which gives a fair number of my beliefs a tacit probability of 1; if I ever find reason to question any of those beliefs, I will have to rederive every belief from the original evidence as far as possible, because such a discovery would expose a significant flaw in the means by which I decide what beliefs to hold. Largely this set consists of mathematical proofs, but it also includes things like “there is not currently a flying green elephant in this room” and “an extant rain god is mutually incompatible with reductionism”.
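For concreteness, here is a minimal numeric sketch (my own illustration, with purely made-up numbers) of the decomposition I mean: each belief is held conditional on the assumption that my reasoning is sound, so its unconditional probability is bounded by my credence in that soundness.

```python
# A toy decomposition (illustrative only): every belief B is reported as
# P(B | S), where S = "my reasoning process is basically sound", so the
# unconditional P(B) follows from the law of total probability.

def unconditional_probability(p_b_given_sound: float,
                              p_sound: float,
                              p_b_given_unsound: float = 0.0) -> float:
    """P(B) = P(B|S) * P(S) + P(B|not S) * P(not S)."""
    return p_b_given_sound * p_sound + p_b_given_unsound * (1.0 - p_sound)

# A belief with tacit probability 1 conditional on soundness, combined
# with (say) 0.999 credence in soundness itself:
print(unconditional_probability(1.0, 0.999))  # 0.999

# With P(B | not sound) = 0, the unconditional confidence can never
# exceed P(sound): no belief ends up more certain than the reasoning
# process it rests on.
```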
This is an amazingly apt description of the mind-state that Robert Anton Wilson called “Chapel Perilous”.
It is interesting that you think so, but I can’t make head or tail of his description of the state, and other descriptions don’t bear any particular resemblance to the state of mind I describe.
My position on the matter boils down to “All my beliefs may be unjustified, but until I have evidence suggesting they are, I should provisionally assume the opposite, because worrying about it is counterproductive.”