That sounds like the correct goal for rationality. I guess the next step for me is to figure out good heuristics for fixing the effects of intentional adjustments, and discovering whether there are cases in real life when I should intentionally believe false things.
It could be a theoretical possibility that never pans out IRL.
Just one more thing, should it actually turn out to be preferable to believe false things:
If you start to believe in "beneficial falsehoods," your ability to accurately check the effects of believing additional falsehoods might be reduced. So keep that in mind.