Hmm, this all roughly makes sense, but I feel like there was some kind of important generator here that you were aiming to convey that I didn’t get.
I think you should probably do most of these things, but not sure which order to do them in, and meanwhile, I think so long as you’re afraid of being unmasked part of the problem seems like it’s about the fear itself?
I think the important generator is: being robust seems like a solution to this “generalized planning fallacy”[1], where you don’t correctly anticipate which corners should not be cut. So, even though you could theoretically excise some wasted motions by cutting pointless corners, you can’t tell which corners are pointless. Therefore, a better policy is just not cutting corners by default.
I think you should probably do most of these things, but not sure which order to do them in,
TBC, the main point isn’t that people should do these specific things per se, the main thing is the overall mindset.
and meanwhile, I think so long as you’re afraid of being unmasked part of the problem seems like it’s about the fear itself?
This is what I was getting at with
There are a lot of things to say about the impostor syndrome on a psychological basis (the fears are often unrealistic / unmerited, etc). But I’d like to take another angle.
I think the fear itself is the key problem with impostor syndrome, and I wasn’t trying to say “just be so good you feel secure” should be the main line of attack on that insecurity.
I don’t particularly like this name, but it’s just a temporary handle. In the planning fallacy, you optimistically stress-test your planned schedule. In this case, you optimistically stress-test your life in a broader sense; being robust attempts to counter that.
I think the important generator is: being robust seems like a solution to this “generalized planning fallacy”[1], where you don’t correctly anticipate which corners should not be cut. So, even though you could theoretically excise some wasted motions by cutting pointless corners, you can’t tell which corners are pointless. Therefore, a better policy is just not cutting corners by default.
Ah, that does make the point much clearer, thanks!
Awesome. I should also note this generator is post hoc; I was (trying to) do this for a few years before I was even thinking about the planning fallacy.
This reflects both a couple of comments I’ve made regarding rules versus analyzing/optimizing and a very unclear thought that’s been bouncing around in my head for a little while now. The thought is about the tendency of discussion here to be very formal and model-oriented, as if we really can optimize in our daily lives the way we can in the theoretical world of our minds and equations and probabilities. I’m not saying that approach is something to avoid, only that it does tend to set the focus on a level of precision that probably does not obtain for most people in a majority of situations. (The recent post about acting on intuitions rather than on calculations, then tracking those over a year or two to see what you learn, fits here too.)
This rule approach clearly takes that case-by-case decision-analysis choice away, if one buys into following the rule. However, we all know that in some cases you can get away with violating the rule (and perhaps even should violate it, or revise it, as has been suggested in other threads/comments as I recall). At the same time, it can be difficult, as you mention, to know just which cases are good ones for violating the general rule. I would add that it might not be enough to keep track of when we violate the rule and what the results were; the problem probably hangs on just how representative the individual cases are and how well we can tell the differences in any truly complex case.
This all seems very paradoxical to me, and generally I can deal with paradox and a contradictory world (or perhaps just with it in my own behavior when I try viewing it from an outside perspective). Still, I find myself wanting to wrap my own thinking around the situation a bit better as I read various approaches and thoughts by people here.