As the post hinted, self-deception can give you confidence, which is useful in almost all real-life situations, from soldier to socialite. Far from “tipping the balance a little bit”, a confidence upgrade is likely to improve your life much more than any amount of rationality training (in the current state of our Art).
Too vague. It’s not clear what your argument’s denotation is, but its connotation (that becoming overconfident is vastly better than trying to be rational) is a strong and dubious assertion that needs more support to move outside the realm of punditry.
IMO John_Maxwell_IV described the benefits of confidence quite well. For the other side, see my post where I explicitly asked people what benefit they derive from the OB/LW Art of Rationality in its current state. Sorry to say, there weren’t many concrete answers. Comments went mostly along the lines of “well, no tangible benefits for me, but truth-seeking is so wonderful in itself”. If you can provide a more convincing answer, please do.
People who debate this often seem to argue for an all-or-nothing approach. I suspect the answer lies somewhere in the middle: be confident if you’re a salesperson but not if you’re a general, for instance. I might look like a member of the “always-be-confident” side to all you extreme epistemic rationalists, but I’m not.
I think a better conclusion is: be confident if you’re being evaluated by other people, but cautious if you’re being evaluated by reality.
A lot of the confusion here seems to come from people with more epistemic than instrumental rationality having difficulty with the idea of deliberately deceiving other people.
But there is another factor: humans penalize themselves for doubt. If they (correctly) estimate their ability as low, they may decide not to try at all, and therefore fail to improve. The doubt is what I’m interested in, not tricking others.
A valid point! However, I think it is the decision to not try that should be counteracted, not the levels of doubt/confidence. That is, cultivate a healthy degree of hubris—figure out what you can probably do, then aim higher, preferably with a plan that allows a safe fallback if you don’t quite make it.
If I could just tell myself to do things and then do them exactly how I told myself, my life would be fucking awesome. Planning isn’t hard. It’s the doing that’s hard.
Someone could (correctly) estimate their ability as low and rationally give it a try anyway, but I think their effort would be significantly lower than that of someone who knew they could do it.
Edit: I just realized that someone reading the first paragraph might get the idea that I’m morbidly obese or something like that. I don’t have any major problems in my life—just big plans that are mostly unrealized.
You may be correct, and as someone with a persistent procrastination problem I’m in no position to argue with your point.
But still, I am hesitant to accept a blatant hack (actual self-deception) over a more elegant solution (finding a way to expend optimal effort while still having a rational evaluation of the likelihood of success).
For instance, I believe that another LW commenter, pjeby, has written about the issues related to planning vs. doing on his blog.
Yeah, I’ve read some of pjeby’s stuff, and I remember being surprised by how non-epistemically rational his tips were, given that he posts here. (If I had remembered any of the specific tips, I probably would have included them.)
If you change your mind and decide to take the self-deception route, I recommend this essay and subsequent essays as steps to indoctrinate yourself.
I’m not an epistemic rationalist; I’m an instrumental one. (At least, if I understand those terms correctly.)
That is, I’m interested in maps that help me get places, whether they “accurately” reflect the territory or not. Sometimes, having a too-accurate map—or spending time worrying about how accurate the map is—is detrimental to actually accomplishing anything.
As is probably clear, I am an epistemological rationalist in essence, attempting to understand and cultivate instrumental rationality, because epistemological rationality itself forces me to acknowledge that it alone is insufficient for, or even detrimental to, accomplishing my goals.
Reading Less Wrong, and observing the conflicts between epistemological and instrumental rationality, has ironically driven home the point that one of the keys to success is carefully managed, controlled self-deception.
I’m not sure yet what the consequences of this will be.
It’s not really self-deception—it’s selective attention. If you’re committed to a course of action, information about possible failure modes is only relevant to the extent that it helps you avoid them. And for the most useful results in life, failures rarely happen so rapidly that you get no warning, nor are they so catastrophic as to be uncorrectable afterwards.
Humans are also biased towards being socially underconfident, because in our ancestral environment, the consequences of a social gaffe could be significant. In the modern era, though, it’s not that common for a minor error to produce severe consequences—you can always start over someplace else with another group of people. So that’s a very good example of an area where more factual information can lead to enhanced confidence.
A major difference between the confident and the unconfident is that the unconfident focus on “hard evidence” in the past, while the confident focus on “possibility evidence” in the future. When an optimist says “I can”, they mean “I am able to develop the capability and will eventually succeed if I persist”, whereas a pessimist may only feel comfortable saying “I can” if they mean “I have done it before.”
Neither one of them is being “self-deceptive”—they are simply selecting different facts to attend to (or placing them in different contexts), resulting in different emotional and motivational responses. “I haven’t done this before” may well mean excitement and challenge to the optimist, but self-doubt and fear for the pessimist. (See also fixed vs. growth mindsets.)
I wish I could upmod you twice for this.
Nowhere is it guaranteed that, given the cognitive architecture humans have to work with, epistemic rationality is the easiest instrumentally rational way to achieve a given goal.
But, personally, I’m still holding out for a way to get from the former to the latter without irrevocable compromises.
It’s easier than you think, in one sense. The part of you that worries about that stuff is significantly separate from—and to some extent independent of—the part of you that actually makes you do things. It doesn’t matter that “you” are only 20% certain about the result, as long as you convince the doing part that you’re 100% certain you’re going to be doing it.
Doing that merely requires that you 1) actually communicate with the doing part (often a non-trivial learning process for intellectuals such as ourselves), and 2) actually take the time to do the relevant process(es) each time it’s relevant, rather than skipping it because “you already know”.
Number 2, unfortunately, means that akrasia is quasi-recursive. It’s not enough to have a procedure for overcoming it; you must also overcome your inertia against applying that procedure on a regular basis. (Or at least, I have not yet discovered any second-order techniques to get myself or anyone else to consistently apply the first-order techniques… but hmmm… what if I applied a first-order technique to the second-order domain? Hmm… must conduct experiments…)
An excellent heuristic, indeed!
It depends on the cost of overconfidence. Nothing ventured, nothing gained. But if the expected cost of venturing wrongly is greater than the expected return, it’s better to be careful what you attempt. If the potential loss is great enough, cautiousness is a virtue. If there’s little investment to lose, cautiousness is a vice.
Right.
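To make that cost-benefit point concrete, here is a minimal sketch (the symbols are my own illustration, not anything from the thread): write p for the probability that the venture succeeds, G for the gain if it does, and L for the loss if it doesn’t. Venturing is worthwhile roughly when

$$ p \cdot G > (1 - p) \cdot L $$

When L is small, the inequality holds for almost any p, so caution buys you little; when L is large, even a fairly high p may not justify the attempt, which is exactly the virtue-or-vice distinction above.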