Some of your beliefs can influence the territory while others can’t.
If everyone suddenly stopped believing that the president of the USA has the authority to command, then the president would cease to be powerful.
The map is part of the territory. If you change the map you also change the territory.
For example, scribbling on the map does change the territory when we are talking about the interaction of agents. If you change your strategy, you will also change the strategies that interacting agents in the territory adopt toward you.
But the shape of the Earth wouldn’t change if everyone suddenly started believing that it is flat (with very high probability, at least, as long as this isn’t a simulation whose parameters somehow depend on what some of us believe ;-)).
Yet if there exists a powerful agent whose actions depend on our beliefs about the shape of the Earth, then we could influence it by deliberately causing ourselves to believe a falsehood. If doing so were beneficial, the useful false belief would trump the true one.
In conclusion, the ‘Litany of Gendlin’ is too simplistic. A set of beliefs is rational as long as it accords with our utility function. It is not rational to believe everything that is true; believing a truth is rational only if doing so maximizes our expected utility.
(Still, there are tricky cases where you could believe one of two incompatible things, in such a way that picking one of them makes it true and the other false. In such cases, you should pick the one that, if true, is preferable to the alternative. Epistemic decisions are under-determined by the criterion of correctness alone; in such cases one should turn to the overall decision problem, and sometimes it may be better to believe incorrect things.)
Upvoted for this.
No: there’s such a thing as epistemic rationality, and it’s the default referent when the phrase “rational belief” is used.