As I have argued elsewhere, I think one big problem here is that, while it surely makes sense to want to believe more true things and fewer false ones, being rational isn't even a coherent notion.
Of course, it could be that the Greek gods really run the universe on their whims and the apparent success of physics is merely a fashion trend popular up on Mount Olympus (hidden from our view). To be meaningful, rationality has to exclude simply waking up tomorrow, deciding you don't like the priors you had the day before, and assigning prior probability 1 to the Greek gods on a whim (unless, of course, you want to say that our past selves somehow constrain us, so that we can't both be rational and evaluate the world afresh, no matter how young we were when we first formed our worldview).
This demonstrates that rationality and truth maximization must diverge. Obviously, the truth-maximizing strategy is simply to dogmatically believe true propositions and dogmatically deny false ones. Moreover, this is a well-defined way to form and update beliefs, and it obeys the rules of probabilistic updating: a proposition assigned prior probability 1 keeps probability 1 under any conditionalization, as the short calculation below makes explicit. So either rationality amounts to luckily picking the correct dogmatism, or it diverges from the truth-maximizing strategy. But if rationality doesn't make you believe the most true things in the actual world, and we lack anything like a coherent, principled notion of a measure on the set of possible worlds, there seems to be no room left for a non-trivial notion of rationality.
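To spell out the updating claim (a minimal sketch I'm adding, where H stands for the dogmatically held proposition and E for any piece of evidence, neither from the original text): if P(H) = 1, then P(E and not-H) = 0, which forces P(E | H) = P(E) for any E with P(E) > 0. Bayes' rule then gives

\[
P(H \mid E) \;=\; \frac{P(E \mid H)\,P(H)}{P(E)} \;=\; \frac{P(E)\cdot 1}{P(E)} \;=\; 1 .
\]

So the dogmatic prior survives every possible observation unchanged; dogmatism is perfectly Bayes-coherent.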
Really, Less Wrong isn't about being rational, since there is no such concept. It's a social club for people of a certain technical/philosophical/social bent. In other words, it's like a book club for people who want to BS about AI, physics, and philosophy, and about the lack of sexual and social desirability these skills offer men. Frankly, I think that's a much more important goal, since I'm more interested in people being happy than rational.