If you are at risk of having fake beliefs latch on to you, then I agree that it is useful to learn about them as a preventative measure. However, I question whether it is common to be at risk of such a thing, because I can’t think of practical examples of fake beliefs happening to non-rationalists, let alone to rationalists (the example you gave doesn’t seem like a fake belief). The examples of fake beliefs used in Map and Territory seem contrived.
In a way it reminds me of decision theory. My understanding is that expected utility maximization works really well in real life and stuff like Timeless Decision Theory is only needed for contrived examples like Newcomb’s problem.