Should I break down that barrier? I’m not sure. I’d do it if it would allow me to make money, I think. But not if it came at the cost of some kind of screaming Cthulhu horror.
Not to other-optimise, but yes.
As far as I can tell, the chances of encountering a true idea that is also a Lovecraftian cosmic horror are below the vanishing point for human brains. (There aren’t neurons small enough to accurately represent chances that tiny, etc.)
It will also help you make money. Example: I received a promotion for demonstrating my ability to make more efficient rosters. This ability came from googling “scheduling problem” and looking at some common solutions, recognising that GRASP-type (greedy randomised adaptive search procedure; page 7) solutions were effective and probably human-brain-computable—and then when I tried rostering, I intuitively implemented a pseudo-GRASP method.
That “intuitively implemented” bit is really important. You might not realise how much you rely on your intuition to decide for you, but it’s a lot. It sounds like taking a lot of theory and jamming it into your intuition is the hard part for you.
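Since GRASP came up: for anyone curious, the pattern is just repeated rounds of greedy-but-randomised construction followed by local search, keeping the best result. Here is a minimal sketch on a toy rostering instance; the shifts, workers, and cost function are all invented for illustration and have nothing to do with the actual roster work described above.

```python
import random

# Hypothetical toy instance: assign each shift to a worker, minimising
# how unevenly the workload is spread.
SHIFTS = ["mon_am", "mon_pm", "tue_am", "tue_pm", "wed_am", "wed_pm"]
WORKERS = ["alice", "bob", "carol"]

def cost(roster):
    # Imbalance: gap between the busiest and quietest worker's shift count.
    loads = [sum(1 for w in roster.values() if w == x) for x in WORKERS]
    return max(loads) - min(loads)

def greedy_randomised_construct(alpha=0.5):
    # Build a roster shift by shift. For each shift, rank workers by
    # current load and pick at random from the best alpha-fraction
    # (the "restricted candidate list" in GRASP terminology).
    roster = {}
    for shift in SHIFTS:
        ranked = sorted(
            WORKERS,
            key=lambda w: sum(1 for v in roster.values() if v == w),
        )
        rcl = ranked[: max(1, round(len(ranked) * alpha))]
        roster[shift] = random.choice(rcl)
    return roster

def local_search(roster):
    # Try single-shift reassignments until none strictly improves the cost.
    improved = True
    while improved:
        improved = False
        for shift in SHIFTS:
            for w in WORKERS:
                candidate = dict(roster, **{shift: w})
                if cost(candidate) < cost(roster):
                    roster, improved = candidate, True
    return roster

def grasp(iterations=20):
    # GRASP proper: construct, polish, keep the best across iterations.
    best = None
    for _ in range(iterations):
        solution = local_search(greedy_randomised_construct())
        if best is None or cost(solution) < cost(best):
            best = solution
    return best

roster = grasp()
print(cost(roster))  # 0 means a perfectly balanced roster
```

The “intuitive” human version is roughly the same loop: pencil in a plausible-looking roster, swap shifts around until nothing obvious improves it, and start over from a different draft if it still looks bad.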
Tangentially, how do you feel about the wisdom of age and the value of experience in making decisions?
I think wisdom and experience are pretty good things—not sure how that relates though.
And “screaming Cthulhu horror” was just a cute phrase—I don’t literally believe in Lovecraft. I just mean “if rationality results in extreme misery, I’ll take a pass.”
I think wisdom and experience are pretty good things—not sure how that relates though.
Some people I have encountered struggle with my rationality because I often privilege general laws derived from decision theory and statistics over my own personal experience: playing tit-for-tat when my gut is screaming that the other player is a defection rock, or joining in mutual fantasising about lottery wins while refusing to buy ‘even one’ lottery ticket. I have found that certain attitudes towards experience and age-wisdom can limit a person’s ability to tag ideas as ‘true in the real world’; for such people, reason and logic alone can only ever reach ‘true, but not actually applicable in the real world’. It was a possibility I thought I should check.
And “screaming Cthulhu horror” was just a cute phrase—I don’t literally believe in Lovecraft.
I assumed it was a reference to concepts like Roko’s idea. As for regular extreme misery, yes, there is a case for rationality being negative. You would probably need some irrational beliefs (that you refuse to rationally examine) that prevent you from taking paths where rationality produces misery. You could probably get a half-decent picture of what paths these might be from questioning LessWrong about it, but that only reduces the chance—still a consideration.