While you’re technically correct, I’d say it’s still a little unfair (in the sense of connoting “haha you call yourself a rationalist how come you’re failing at akrasia”).
Two assumptions that can, I think you’ll agree, take away from the force of “akrasia is epistemic failure”:
if modeling and solving akrasia is, like dieting, a hard problem that even “experts” barely have an edge on, and, importantly, the things that do work are highly individual-specific, making it quite hard to stand on the shoulders of giants
if a large percentage of the people who’ve found and read through the Sequences etc. did so only because they had very important deadlines to procrastinate on
...then on average you’d expect akrasia to be over-represented among rationalists. Add to this that akrasia itself makes it harder to deliberately aim your rationality skills at the things you want, and it can stay stable even under very persistent effort.