All right, I’d like to attempt a summary to make sure that I am understanding this post; if anyone sees a mistake in my interpretation, I’d appreciate it if they let me know.
Virtually everyone wants their beliefs to be true, which amounts to saying that practically everyone wants to be epistemically rational. Rationality is a rare trait, so obviously that desire is not enough to make you epistemically rational. But that desire, combined with the rare desire to have all of your beliefs make useful predictions about whatever they talk about, is enough, provided that you never subordinate the predictive power of a belief to your conviction that it is true. If you allow yourself to go on believing something because you thought it was true, even after you notice that some other belief makes reliably better predictions about the same target of inquiry, at that point you fail as a rationalist.
Is there anything I’m missing?
What if what I want isn’t true beliefs or to be rational, but to have the best method for finding truths in general, and I use the predictive power of a belief as the best guide to its truth value? In that case, if I find a method that works better than mine, i.e., one that leads to beliefs with higher predictive power more often than my old method, I’ll switch to that method. I don’t pride myself on having the best method; I pride myself on doing everything I can to find the best method. And let’s say that finding the best method for finding truth in general is far more important to me than my own life. Is that enough, do you think?
It seems a lot like trying to be rational for its own sake, and I know that EY says that that leads to an infinite recursion, but I don’t see why the person I described above must be using circular justification.
Please help if you can.