I don’t agree with this conflation of commitment and belief. I’ve never had to run from a predator, but when I run to catch a train, I am fully committed to catching the train, although I may be uncertain about whether I will succeed. In fact, the less time I have, the faster I must run, but the less likely I am to catch the train. That likelihood only affects my decision whether to run at all. Once the decision is made, belief and uncertainty are irrelevant; intention and action are everything.
Maybe some people have to make themselves believe in an outcome they know to be uncertain, in order to achieve it, but that is just a psychological exercise, not a necessary part of action.
The question is not whether there are some examples of commitment which do not involve belief. The question is whether there are examples (some, perhaps many) where really, absolutely full commitment does involve belief. I think there are many.
Consider what commitment is. If someone says, “you don’t seem fully committed to this”, what sort of thing might have prompted him to say this? It’s something like, he thinks you aren’t doing everything you could possibly do to help this along. He thinks you are holding back.
You might reply to this criticism, “I am not holding anything back. There is literally nothing more that I can do to further the probability of success, so there is no point in doing more—it would be an empty and possibly counterproductive gesture rather than being an action that truly furthers the chance of success.”
So the important question is, what can a creature do to further the probability of success? Let’s look at you running to catch the train. You claim that believing that you will succeed would not further the success of your effort. Well, of course not! I could have told you that! If you believe that you will succeed, you can become complacent, which runs the risk of slowing you down.
But if you believe that there is something chasing you, that is likely to speed you up.
Your argument is essentially, “my full commitment didn’t involve belief X, therefore you’re wrong”. But belief X is a belief that would have slowed you down. It would have reduced, not furthered, your chance of success. So of course your full commitment didn’t involve belief X.
My point is that it is often the case that a certain consciously felt belief would increase a person’s chances of success, given their chosen course of action. And given what commitment is (the commitment of one’s self and one’s resources to furthering the probability of success), if a belief would further the chance of success, then full, really full commitment will include that belief.
So I am not conflating conscious belief with commitment. I am saying that conscious belief can be, and often is, involved in the furthering of success, and therefore can be and often is a part of really full commitment. That is no more conflating belief with commitment than saying that a strong fabric makes a good coat conflates fabric with coats.
You’re right that my analogy was inaccurate: what corresponds in the train-catching scenario to believing there is a predator is my belief that I need to catch this train.
A stronger belief may produce stronger commitment, but strong commitment does not require strong belief. The animal either flees or does not, because a half-hearted sprint will have no effect on the outcome whether a predator is there or not. Similarly, there’s no point making a half-hearted jog for a train, regardless of how much or little one values catching it.
Belief and commitment to act on the belief are two different parts of the process.
Of course, a lot of the “success” literature urges people to have faith in themselves, to believe in their mission, to cast all doubt aside, etc., and if a tool works for someone I’ve no urge to tell them it shouldn’t. But, personally, I take Yoda’s attitude: “Do, or do not.”
Yoda tutors Luke in a Jedi philosophy and practice, which will take Luke a while to learn. In the meantime, however, Luke is merely an unpolished human. And I am not here recommending a particular philosophy and practice of thought and behavior, but making a prediction about how unpolished humans (and animals) are likely to act. My point is not to recommend that Buridan’s ass should have an exaggerated confidence that the right bucket is closer, but to observe that we can expect him to have one: for the reasons I described, exaggerated confidence is likely to have been selected for, since it would have improved the chances of survival of asses who did not have the benefit of Yoda’s instruction.
So I don’t recommend; rather, I expect that humans will commonly have conscious feelings of confidence which are exaggerated, and which do not truly reflect the output of the human’s mental black box, the mental machinery to which he does not have access.
Let me explain what I mean here, because I’m saying that the black box can output a 51% probability for Proposition P while at the same time causing the person to be consciously, absolutely convinced of the truth of P. This may be confusing: I seem to be saying that the black box outputs two probabilities, a 51% probability for purposes of decision-making and a 100% probability for conscious consumption. An example should make clear what I mean.
Suppose you want to test Buridan’s ass to see what probability he assigns to the proposition that the right bucket is closer. You can alter the scenario as follows: introduce a mechanism which, with 4% probability, will move the right bucket further away than the left bucket before the ass gets to it.
Now, if Buridan’s ass assigns a 100% probability that the right bucket is (currently) closer than the left bucket, then taking the introduced mechanism into account yields a 96% probability that, by the time the ass gets to it, the right bucket will still be closer to the ass’s starting position. But if he assigns only a 51% probability that the right bucket is (currently) closer, then the mechanism brings that down to roughly a 49% probability (0.51 × 0.96 ≈ 0.49) that the right bucket will still be closer by the time he reaches it.
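Here is a minimal sketch of that arithmetic, assuming (as in the scenario above) that the mechanism can only demote the right bucket and never touches the left one, so the ass’s prior and the mechanism’s miss rate simply multiply; the function name is my own illustrative choice:

```python
# The right bucket is still closer on arrival only if it was closer to
# begin with (the ass's prior, p) AND the mechanism did not fire (1 - m).
# The mechanism can only move the right bucket further away, never the
# left one, so the two independent events multiply.
def right_closer_at_arrival(p: float, m: float = 0.04) -> float:
    return p * (1 - m)

print(right_closer_at_arrival(1.00))  # 0.96   -> still above 50%: go right
print(right_closer_at_arrival(0.51))  # 0.4896 -> now below 50%: go left
```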
I am, of course, assuming that the ass is smart enough to understand and incorporate the mechanism into his calculations. Animals have eyes and ears and brains for a reason, so I don’t think it’s a stretch to suppose that there is some way to implement this scenario in a way that an ass really could understand.
So here’s how the test works. You observe that the ass goes to the bucket on the right. You are not sure whether the ass has assigned a 51% probability or a 100% probability to the right bucket being nearer. So you redo the experiment with the added mechanism. If the ass now goes to the bucket on the left, you can infer that he believes the probability that the right bucket will be closer by the time he reaches it is less than 50%. But the mechanism only shifted that probability by a few percentage points. Therefore he must have assigned only slightly more than 50% probability to it to begin with.
And in this sort of way, you can elicit the ass’s probability assignments.
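To picture the general procedure (a hypothetical sketch, not part of the original thought experiment; the decision rule and the search helper are my own illustrative names), you can sweep the mechanism’s probability m and watch for the point at which the ass switches buckets. Assuming one bucket is always strictly closer, the ass goes right as long as p(1 − m) exceeds 50%, so the switch point pins down the prior p:

```python
# Hypothetical elicitation sketch: assuming exactly one bucket is strictly
# closer, the ass goes right iff its estimate that the right bucket will
# still be closer on arrival, p * (1 - m), exceeds 0.5.
def goes_right(p: float, m: float) -> bool:
    return p * (1 - m) > 0.5

# Binary-search the mechanism probability m for the switch point m*,
# then recover the prior from p * (1 - m*) = 0.5, i.e. p = 0.5 / (1 - m*).
def recover_prior(choice_at, steps: int = 30) -> float:
    lo, hi = 0.0, 1.0
    for _ in range(steps):
        mid = (lo + hi) / 2
        if choice_at(mid):   # still goes right: push m higher
            lo = mid
        else:                # switched to the left bucket: pull m lower
            hi = mid
    return 0.5 / (1 - (lo + hi) / 2)

# An ass whose black box holds p = 0.51 switches at m* ~ 0.0196,
# and the recovered prior comes out at ~ 0.51.
print(recover_prior(lambda m: goes_right(0.51, m)))
```

This is why the 4% mechanism flips an ass whose prior is 51% (its switch point is already below 4%) but leaves an ass with a 100% prior unmoved.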
The ass’s conscious state of mind, however, is something completely separate from this. If we grant the ass the gift of speech, the ass may well say, each time, “there’s not a shred of doubt in my mind that the right bucket is closer”, or “I am entirely confident that the left bucket is closer”.
My point being that we may well be like the ass, and introspective examination of our own conscious state of mind may fail to reveal the actual probabilities that our mental black boxes have assigned to events. It may instead reveal only overconfident delusions that the black box has instilled in the conscious mind for the purpose of encouraging quick action.