The top item on my to-do list reads: “If confused, make list! If confusion persists, make lists for lists!”
Point being: I think taskifying, that is, refusing to count a difficult, unpleasant task as a single item, is useful because it better mirrors reality. For (very ground-level) instance, eating enough meals in a day is hard for me to do consistently because “eat a meal” actually has a ton of steps: decide what to eat, find the ingredients, assemble them, and so on. So if I lie to myself and say it’s only one step, I feel stupid for having trouble with Just One Step, and subsequently don’t do anything because I’m in an Ugh Field. If I instead acknowledge that having trouble accomplishing something means it has multiple steps… well, I still do less than my fictional idealized self would, but more than I would otherwise.
I find that a lot of my friends have trouble grokking this, because rationalist and perfectionist ideas cluster together so heavily. For some reason it’s hard to think about what a perfect rational agent would do without, at least a little and unconsciously, comparing oneself to that agent.