Without looking at outcomes of a decision-making process it's hard to know whether it's optimized.

Which is why I said optimizing, not optimized :-) Externalizing the decision-making process and then evaluating it using numbers is a way of optimizing it, not necessarily meaning it is the perfect optimized decision-making process. I decided to share my story now because I thought the benefits to others in the LW community of sharing it right away would be higher than waiting until I had lived in the house for a while. Besides, by the time I live in the house, post-factum justification would be playing a potentially confounding role.

Optimizing assumes that you know you are moving in the right direction.

Eliezer recently wrote on FB in the LW Group:

I hypothesize that the best result will come from making up numbers, multiplying them, and then tossing them out the window and going with what seems like the intuitively best choice afterward.
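The "making up numbers, multiplying them" step could be sketched as a weighted-factor comparison. A minimal illustration follows; every option, factor, score, and weight here is invented purely for the example:

```python
# Hypothetical weighted-factor comparison illustrating the "make up
# numbers, multiply them" step. All options, factors, scores, and
# weights below are made up for illustration.

options = {
    "house_a": {"price": 7, "location": 9, "housemates": 6},
    "house_b": {"price": 9, "location": 5, "housemates": 8},
}

# Made-up importance weights for each factor (1-10 scale).
weights = {"price": 5, "location": 8, "housemates": 10}

def score(factors):
    # Multiply each made-up score by its made-up weight and sum.
    return sum(weights[f] * v for f, v in factors.items())

for name, factors in options.items():
    print(name, score(factors))

# Per the quote, after computing these numbers you would toss them
# out and go with whichever option intuitively feels best anyway.
```

The point of the exercise is not the final number but the act of externalizing the factors, which is what surfaces the attention bias discussed below.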
As far as I understand, CFAR also doesn't advocate doing things that feel very wrong on an intuitive level. To the extent that you believe "shut up and calculate" is useful, you haven't provided an argument for why you believe it's an optimization.
I’m confused by your presumption that I suggested doing things that feel very wrong on an intuitive level. Can you please highlight to me where I stated that? Thanks!
I don’t think you suggested doing things that feel wrong, but I think “shut up” suggests letting the data speak for itself and ignoring how it feels.
I was using the metaphor Eliezer used here, so I think it might be a semantics issue. The point of using math was to deal with attention bias, as I highlighted above. After that, it’s important to evaluate feelings, for sure.