Is it not useful to avoid accepting false beliefs? To intercept these false beliefs before they can latch on to your mind or the mind of another. In this sense, you should practice spotting false beliefs until it becomes reflexive.
If you are at risk of having fake beliefs latch on to you, then I agree that it is useful to learn about them in order to prevent that. However, I question whether this risk is common: I can't think of practical examples of fake beliefs afflicting non-rationalists, let alone rationalists (the example you gave doesn't seem like a fake belief). The examples of fake beliefs used in Map and Territory seem contrived.
In a way it reminds me of decision theory. My understanding is that expected utility maximization works really well in real life and stuff like Timeless Decision Theory is only needed for contrived examples like Newcomb’s problem.
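To make that comparison concrete, here is a minimal sketch (in Python) of what plain expected utility maximization looks like for an everyday decision; the umbrella scenario, probabilities, and utility numbers are all made up for illustration and are not from either post.

```python
# Minimal sketch of expected utility maximization for an everyday choice.
# All outcomes, probabilities, and utilities below are hypothetical.

def expected_utility(outcomes):
    """Sum of probability * utility over the possible outcomes of an action."""
    return sum(p * u for p, u in outcomes)

# Hypothetical decision: carry an umbrella or not, given a 30% chance of rain.
# Each action maps to a list of (probability, utility) pairs.
actions = {
    "take umbrella": [(0.3, 5), (0.7, 8)],
    "leave umbrella": [(0.3, -10), (0.7, 10)],
}

best = max(actions, key=lambda a: expected_utility(actions[a]))
for action, outcomes in actions.items():
    print(f"{action}: EU = {expected_utility(outcomes):.1f}")
print("Choose:", best)
```

Running this picks "take umbrella" (EU 7.1 vs 4.0); the point is just that ordinary situations like this never require anything beyond straightforward expected utility, which is why exotic decision theories only seem to earn their keep on cases like Newcomb's problem.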