Most rationalists have lots of problems, and believing that the biggest problem is consistently a lack of willpower is the sort of blinding optimism I spoke about in the OP. I don't see why taking incorrect actions wouldn't make akrasia problems worse; at best they're neutral, since they don't advance anything.
When system I and system II aren't aligned, you get akrasia. Reading your post, I imagine you would classify cases where system II says one shouldn't engage in an action but system I wants to engage in it as making an error.
If you allow system II to censor system I from engaging in such actions, I expect you will get fewer actions.