To make sure I understand your point, let me try to clarify it.
We have very limited access to our mental processes. In fact, in some cases that access is indirect: we only discover what we believe once we have observed how we act. We observe our own action, and from it we infer that we must have believed such-and-such. We can attempt to reconstruct our own process of thinking, but the process we are reconstructing is essentially a black box whose internals we are modeling, and the outputs of the black box at any given time are meager. We are of course always using the black box, which gives us a lot of data in an absolute sense, but since the topic is constantly changing and since our beliefs are also in flux, the relevance of most of that data to the correct understanding of a particular act of thinking is unclear. In modeling our own mental processes we are rationalizing, with all the potential pitfalls that rationalization brings.
Nevertheless, this does not stop us from using the familiar gambling method for eliciting probability assessments, understood as willingness to wager. The gambling method, even if it is artificial, is at least reasonable, because every behavior we exhibit involves a kind of wager. However the black box operates, it will produce some response to each offered betting odds, and from those responses its probability assignments can be derived. Of course this won't work if the black box produces inconsistent (i.e. Dutch-bookable) responses to the betting odds, but whether and to what degree it does so is an empirical question. You have been talking about precision, and I think this suggests a way to define the precision of a probability assignment. The black box's responses to betting odds will surely be somewhat inconsistent, and we can measure how inconsistent they are. There will be a gap of a certain size which can be Dutch booked: the bigger the gap, the quicker you can be milked. The size of that gap is the measure of the precision of your probability assignment.
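To make that concrete, here is a minimal Python sketch of the idea under assumptions of my own: the agent model and the `will_buy` / `will_sell` interface are hypothetical stand-ins for the black box's responses to offered bets, not anything established in this discussion.

```python
def elicit(will_buy, will_sell, grid_size=10_000):
    """Scan ticket prices in (0, 1) for a bet that pays 1 unit if the event occurs.

    Returns (highest price the agent will pay, lowest price at which it will sell).
    """
    prices = [i / grid_size for i in range(1, grid_size)]
    max_buy = max((p for p in prices if will_buy(p)), default=0.0)
    min_sell = min((p for p in prices if will_sell(p)), default=1.0)
    return max_buy, min_sell

# A deliberately inconsistent black box: it overpays when buying and
# undercharges when selling, so its buy and sell thresholds overlap.
will_buy = lambda price: price <= 0.65   # buys the ticket at any price up to 0.65
will_sell = lambda price: price >= 0.55  # sells the ticket at any price down to 0.55

max_buy, min_sell = elicit(will_buy, will_sell)
gap = max_buy - min_sell
if gap > 0:
    # Buy a ticket from the agent at min_sell and sell one to it at max_buy:
    # a guaranteed profit of `gap` per round, which is the rate at which it can be milked.
    print(f"Dutch-bookable gap: {gap:.2f} per round")
else:
    # No arbitrage; the interval [max_buy, min_sell] is the residual imprecision.
    print(f"Probability pinned down to [{max_buy:.2f}, {min_sell:.2f}]")
```

On this toy reading, the width of the overlap (or of the remaining interval when there is no overlap) plays the role of the precision of the elicited probability.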
But suppose that a person always in effect bets for something given certain odds or above, in whatever manner the bet is put to him, and always bets against at any odds below, and suppose the cutoff between his betting for and against is some very precise number such as pi to twelve digits. Then that seems to say that the odds his black box assigns are precisely those odds.
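A toy sketch of that sharp-cutoff case (the agent and the exact cutoff value are hypothetical, chosen only to echo the pi-to-twelve-digits example): a simple bisection over offered odds recovers the cutoff, and hence the implied probability, to as many digits as the betting behavior stays consistent.

```python
import math

# Hypothetical agent whose cutoff is pi rounded to twelve decimal places:
# it bets for at any offered odds at or above the cutoff, and against below it.
CUTOFF_ODDS = round(math.pi, 12)
bets_for = lambda odds: odds >= CUTOFF_ODDS

# Bisection over offered odds homes in on the cutoff; the implied probability
# at odds of x:1 is x / (1 + x).
lo, hi = 0.0, 100.0
for _ in range(60):
    mid = (lo + hi) / 2
    if bets_for(mid):
        hi = mid   # still bets for: the cutoff is at or below mid
    else:
        lo = mid   # bets against: the cutoff is above mid

print(f"recovered cutoff odds: {hi:.12f}")
print(f"implied probability:   {hi / (1 + hi):.12f}")
```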
You write:
The problem is that the algorithms that your brain uses to perform common-sense reasoning are not transparent to your conscious mind, which has access only to their final output. This output does not provide a numerical probability estimate, but only a rough and vague feeling of certainty.
But I don’t think we should be looking at introspectable “output”. The purpose of the brain isn’t to produce rough and vague feelings which we can then appreciate through inner contemplation. The purpose of the brain is to produce action: to decide on a course of action and then move the muscles accordingly. Our introspective power is limited at best. Over a lifetime of knowing ourselves we can probably get pretty good at knowing our own beliefs, but I don’t think we should treat introspection as the gold standard for measuring a person’s belief. Like preference, belief is revealed in action. And action is what the gambling method of eliciting probability assignments looks at. While the brain produces only rough and vague feelings of certainty for the purposes of one’s own navel-gazing, at the same time it produces very definite behavior, very definite decisions, from which can be derived, at least in principle, probability assignments, and also, as I mention above, the precision of those probability assignments.
I grant, by implication, that one’s own probability assignments are not necessarily introspectable. That goes without saying.
You write:
Therefore, there are only two ways in which you can arrive at a numerical probability estimate for a common-sense belief:
1. Translate your vague feeling of certainty into a number in some arbitrary manner. This however makes the number a mere figure of speech, which adds absolutely nothing over the usual human vague expressions for different levels of certainty.
2. Perform some probability calculation, which however has nothing to do with how your brain actually arrived at your common-sense conclusion, and then assign the probability number produced by the former to the latter. This is clearly fallacious.
Your first described way takes the vague feeling to be the output of the black box. But the purpose of the black box is action, decision, and that is the output we should be looking at; it is also the output the gambling method looks at. And that is a third way of arriving at a numerical probability which you didn’t cover.
Aside from some quibbles that aren’t really worth getting into, I have no significant disagreement with your comments. There is nothing wrong with looking at people’s acts in practice and observing that they behave as if they operated with subjective probability estimates in some range. However, your statement that “one’s own probability assignments are not necessarily introspectable” basically restates my main point, which was precisely about the meaninglessness of analyzing one’s own common-sense judgments to arrive at a numerical probability estimate, a procedure which many people here, in contrast, consider to be the right way to increase the accuracy of one’s thinking. (Though I admit that it should probably be worded more precisely to make sure it’s interpreted that way.)
As it happens, early on I voted your initial comment down (following the topsy-turvy rules of the main post) because, on first impression, I thought I agreed with you. Reconsidering your comment in light of the ensuing discussion brought to mind this seeming objection. But you have disarmed the objection, so I am back to agreement.