What would it mean for rationality to be “objectively better”? It depends what the objective is. If your objective is “predictive power,” then by some definitions you are already a rationalist.
Is your issue that predictive power isn’t a good objective, or that there are better methods for prediction than those discussed on this site?
Easy. Predictive power.
It seems like you have strong feelings about rationality without actually knowing what that word means here.
If Bay Area rationality is basically correct, it can recognise improved versions of itself.
Corollary: If it isn’t basically correct, it won’t necessarily be able to recognise a better epistemology.
Moral: It takes an epistemology to judge an epistemology.
These are tautologies. What is the point you’re getting at?
“Predictive power” isn’t the answer to everything, for tautologous reasons. (Whatever problem tautologies have, it isn’t lack of truth.)
I didn’t say it was the answer to everything. The original phrasing was “more truthful.”
The implication was that you already have an epistemology capable of judging any other.
Is this an epistemology?
Explain why not all humans are rationalists, then, if the paradigm of rationality has more predictive power than their paradigms.
Explain how it feels from inside, for humans, to look at rationality and fail to update, despite rationality being objectively better.