There’s something very interesting here. You say your goals are less ambitious and akrasia isn’t a major problem for you. That sounds right. But what exactly does this entail? Your life sounds quite busy and productive. Although your goals might be less ambitious than what others are trying to achieve, I doubt those others are really doing much more work than you, either physically or mentally. This, I think, hints at two possible sources of akrasia: uncertainty and bad goals. Becoming a nurse is something a great number of people have done before, so there’s no great uncertainty surrounding its achievability. In that sense it’s relatively straightforward. We can also be sure it’s a good goal: a nurse is a real, existing thing, and you’re probably not confused about what a nurse is or what you have to do to become one. I wouldn’t be so sure about, say, developing the first seed AI. I could, in fact, be entirely confused about what that goal is; the concept could be incoherent; the goal might not break down into sub-goals because it’s a mirage; and I could end up unable to make progress and unable to explain why. The goal, in short, is surrounded by uncertainty. Could akrasia be a signal of bad, or at least uncertain, goals?