Isn’t the objective of rationality to correctly align our beliefs with reality, so that they may pay rent when we try to achieve our goals?
Protecting oneself against manipulation, learning to argue correctly and getting used to being defeated are all byproducts of the fact that there is only one reality, independent of the mind.
I like this formulation by itself.

I think we can take something clear and simple from the posts below: rationality should not only help you accomplish your goals, but also help you define your goals clearly and identify easy and (more importantly) useful goals that are likely to induce a prolonged (preferably indefinite) period of well-being.
Can we at least agree that these three imperatives
1. Believe true things
2. Achieve goals
3. Induce well-being
are not identical? There seems to be a “rationality thesis” here that the best way to go about 2 and 3 is to sort out 1 first. I would like to see this thesis stated more clearly.
This may very well be the case today, or in our society, but it’s not really difficult to imagine a society in which you have to ‘hold’ really crazy ideas in order to win.
Also, believing true things is an endeavour that is never completed per se: it surely is not possible to have it sorted out simpliciter before attaining 2 (and the third imperative I really see as a subgoal of the second).
The thesis, after all, conflicts with basically the entire history of humanity: Homo sapiens has won more and more without ever attaining perfect accuracy. However, it seems to me that it has won more where it has accumulated a greater store of truths.
So I won’t go so far as to say that in order to win you have to be accurate, but I think a strong case can be made that accuracy raises the probability of winning.
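To make that last claim a bit more concrete, here is a minimal sketch of what I mean. It is purely my own toy model (the agents, the numbers and the betting game are not taken from any of the posts above): two agents differ only in how accurately they predict a coin, and we simply check that the more accurate one wins the match more often, i.e. that P(winning | higher accuracy) > P(winning | lower accuracy).

```python
import random

def run_match(accuracy_a, accuracy_b, rounds=100):
    """Return True if agent A ends the match richer than agent B.

    accuracy_x is the probability that the agent calls the coin correctly.
    """
    wealth_a = wealth_b = 0
    for _ in range(rounds):
        outcome = random.random() < 0.5  # a fair coin, unknown in advance
        guess_a = outcome if random.random() < accuracy_a else (not outcome)
        guess_b = outcome if random.random() < accuracy_b else (not outcome)
        wealth_a += 1 if guess_a == outcome else -1
        wealth_b += 1 if guess_b == outcome else -1
    return wealth_a > wealth_b

matches = 10_000
wins = sum(run_match(accuracy_a=0.6, accuracy_b=0.5) for _ in range(matches))
print(f"P(A wins | A is more accurate) ≈ {wins / matches:.2f}")  # well above 0.5
```

Nothing hangs on the specific numbers; the point is only that, other things being equal, accuracy shifts the odds.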
What, then, is the real purpose of rationality? I’m perfectly fine if we accept the conjunction “truth ∧ winning”, with the proviso that P(winning | high degree of truth) > P(winning | low degree of truth). However, if Omega were to pop up and ask:
“You must choose between two alternatives. I can give you the real TOE and remove your cognitive biases if you agree to live a miserable life, or you can live a very comfortable and satisfying existence, provided that you let me implant the belief in the Flying Spaghetti Monster.”
I confess I would guiltily choose the second.