If I could just tell myself to do things and then do them exactly as I told myself to, my life would be fucking awesome. Planning isn’t hard. It’s the doing that’s hard.
Someone could (correctly) estimate their ability as low and rationally give it a try anyway, but I think their effort would be significantly lower than that of someone who knew they could do it.
Edit: I just realized that someone reading the first paragraph might get the idea that I’m morbidly obese or something like that. I don’t have any major problems in my life—just big plans that are mostly unrealized.
You may be correct, and as someone with a persistent procrastination problem I’m in no position to argue with your point.
But still, I am hesitant to accept a blatant hack (actual self-deception) over a more elegant solution (finding a way to expend optimal effort while still having a rational evaluation of the likelihood of success).
For instance, I believe that another LW commenter, pjeby, has written about the issues related to planning vs. doing on his blog.
Yeah, I’ve read some of pjeby’s stuff, and I remember being surprised by how non-epistemically rational his tips were, given that he posts here. (If I had remembered any of the specific tips, I probably would have included them.)
If you change your mind and decide to take the self-deception route, I recommend this essay and subsequent essays as steps to indoctrinate yourself.
I’m not an epistemic rationalist, I’m an instrumental one. (At least, if I understand those terms correctly.)
That is, I’m interested in maps that help me get places, whether they “accurately” reflect the territory or not. Sometimes, having a too-accurate map—or spending time worrying about how accurate the map is—is detrimental to actually accomplishing anything.
As is probably clear, I am an epistemic rationalist at heart, trying to understand and cultivate instrumental rationality, because epistemic rationality itself forces me to acknowledge that it alone is insufficient for, and sometimes even detrimental to, accomplishing my goals.
Reading Less Wrong, and observing the conflicts between epistemic and instrumental rationality, has ironically driven home the point that one of the keys to success is carefully controlled self-deception.
I’m not sure yet what the consequences of this will be.
It’s not really self-deception—it’s selective attention. If you’re committed to a course of action, information about possible failure modes is only relevant to the extent that it helps you avoid them. And in pursuit of the most useful results in life, most failures neither happen so rapidly that you get no warning, nor are they so catastrophic as to be uncorrectable afterwards.
Humans are also biased towards being socially underconfident, because in our ancestral environment, the consequences of a social gaffe could be significant. In the modern era, though, it’s not that common for a minor error to produce severe consequences—you can always start over someplace else with another group of people. So that’s a very good example of an area where more factual information can lead to enhanced confidence.
A major difference between the confident and the unconfident is that the unconfident focus on “hard evidence” from the past, while the confident focus on “possibility evidence” about the future. When an optimist says “I can”, they mean, “I am able to develop the capability and will eventually succeed if I persist,” whereas a pessimist may only feel comfortable saying “I can” if they mean, “I have done it before.”
Neither one of them is being “self-deceptive”—they are simply selecting different facts to attend to (or placing them in different contexts), resulting in different emotional and motivational responses. “I haven’t done this before” may well mean excitement and challenge to the optimist, but self-doubt and fear for the pessimist. (See also fixed vs. growth mindsets.)
I wish I could upmod you twice for this.
Nowhere is it guaranteed that, given the cognitive architecture humans have to work with, epistemic rationality is the easiest instrumentally rational way to achieve a given goal.
But, personally, I’m still holding out for a way to get from the former (epistemic rationality) to the latter (achieving my goals) without irrevocable compromises.
It’s easier than you think, in one sense. The part of you that worries about that stuff is significantly separate from—and to some extent independent of—the part of you that actually makes you do things. It doesn’t matter whether “you” are only 20% certain about the result as long as you convince the doing part that you’re 100% certain you’re going to be doing it.
Doing that merely requires that you 1) actually communicate with the doing part (often a non-trivial learning process for intellectuals such as ourselves), and 2) actually take the time to do the relevant process(es) each time it’s relevant, rather than skipping it because “you already know”.
Number 2, unfortunately, means that akrasia is quasi-recursive. It’s not enough to have a procedure for overcoming it; you must also overcome your inertia against applying that procedure on a regular basis. (Or at least, I have not yet discovered any second-order techniques to get myself or anyone else to consistently apply the first-order techniques… but hmm… what if I applied a first-order technique to the second-order domain? Hmm… must conduct experiments…)