Even in people whose conscious world models are basically sane (by our exacting LW standards), when they're considering or planning some weird, uncomfortable, out-of-the-ordinary action seemingly justified by weighing costs and benefits, it seems to me that akrasia can sometimes be a rational stand-in for considerations they don't keep conscious track of, including but not limited to:

- reputation costs;
- willpower loss / ego depletion;
- other limits to worry;
- the inference that if one weird thing seems especially important now, others will seem especially important in the future;
- the possibility of wasting resources by not following through on a weird plan;
- the desirability of keeping one's mind "cleaner" by keeping fewer chunks in one's planning space;
- the benefits of long-term happiness, and of not associating unhappiness with rationality, for oneself and other people;
- the benefits of having one's actions and motivations make sense to other people;
- various self-image issues.

There will be unmodeled considerations in the other direction too, but I suspect they will normally be fewer.
It’s better to forego a 1-expected-util hare-brained scheme than let it distract you into a 20% chance of foregoing a 10-expected-util hare-brained scheme, or so Morgensternsai tells me.
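The arithmetic behind that sentence can be made concrete with a toy calculation. The 1-util and 10-util figures are from the sentence above; treating "distraction" as a flat 20% chance of losing the bigger scheme entirely is my simplifying assumption:

```python
# Toy expected-utility comparison: pursue a small weird scheme now,
# at the risk of crowding out a larger one later.
small_scheme_util = 1.0   # expected utility of the scheme at hand
big_scheme_util = 10.0    # expected utility of a future, better scheme
p_distraction = 0.20      # assumed chance the small scheme crowds out the big one

# Pursue the small scheme: gain its utility, but risk forfeiting the big one.
pursue = small_scheme_util - p_distraction * big_scheme_util
# Forgo it: gain nothing now, keep the bigger opportunity intact.
forgo = 0.0

print(pursue, forgo)  # -1.0 0.0: forgoing comes out ahead
```

On these numbers the expected cost of distraction (0.2 × 10 = 2 utils) exceeds the 1 util gained, so letting akrasia veto the small scheme is the higher-expected-utility policy.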
On the other hand, I don’t want to hand people tools for mediocrity here; many times akrasia in these situations really is just irrational. I wish I had a better idea of when.