Keep track of when you change your mind about important facts based on new evidence.
a) If you rarely change your mind, you’re probably not rational.
b) If you always change your mind, you’re probably not very smart.
c) If you sometimes change your mind, and sometimes not, I think that’s a pretty good indication that you’re rational.
Of course, I feel that I fall into category (c), which is my own bias. I could test this if there were a database of how often other people had changed their minds, cross-referenced with IQ.
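To make that test concrete, here is a minimal sketch (in Python) of what it might compute, over a purely hypothetical dataset of (times-changed-mind, IQ) pairs; note that the “sometimes is best” claim predicts an inverted U rather than a straight line, so a plain correlation alone wouldn’t settle it:

```python
import statistics

# Hypothetical records: (number of times a person changed their mind on
# an important fact, IQ score). No such database exists; these values
# are invented purely to illustrate the shape of the test.
records = [
    (0, 95), (1, 108), (2, 124), (3, 131), (4, 122),
    (8, 104), (12, 92), (2, 135), (3, 127), (1, 101),
]

changes = [c for c, _ in records]
iqs = [iq for _, iq in records]

# A plain linear correlation. If "sometimes is best" holds, the true
# relationship is an inverted U, so this may land near zero.
r = statistics.correlation(changes, iqs)
print(f"linear correlation(changes, IQ) = {r:.2f}")

# Crude inverted-U check: compare the mean IQ of moderate changers
# (here, arbitrarily, 2-4 changes) against the rarely/always groups.
moderate = [iq for c, iq in records if 2 <= c <= 4]
extreme = [iq for c, iq in records if c < 2 or c > 4]
print(f"mean IQ, sometimes-changers: {statistics.mean(moderate):.1f}")
print(f"mean IQ, rarely/always:      {statistics.mean(extreme):.1f}")
```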
Here are some examples from my own past:
I used to completely discount AGW (anthropogenic global warming). Now I think it is occurring, but I also think that the negative feedbacks are being ignored or downplayed.
I used to think that the logical economic policy was always the right one. Now, I (begrudgingly) accept that if enough people believe an economic policy is good, it will work, even though it’s not logical. And, concomitantly, a logical economic policy will fail if enough people hate it.
Logic is our fishtank, and we are the fish swimming in it. It is all we know. But there is a possibility that there’s something outside the fishtank, that we are unable to see because of our ideological blinders.
The two great stresses in ancient tribes were A) “having enough to eat” and B) “being large enough to defend the tribe from others”. Those are more or less contradictory goals, but both are incredibly important. People who want to punish rule-breakers and free-riders are generally more inclined to weigh A) over B). People who want to grow the tribe by being more inclusive and accepting of others are more inclined to weigh B) over A).
None of the modern economic theories seems to be any good at handling crises. I used to think that the Chicago and Austrian schools had better answers than the Keynesians.
I used to think that banks should just have been allowed to die; now I’m not so sure. I see a fair amount of evidence that letting them fail would have caused a significant panic. Not sure either way.
The words are vague enough that I think we’ll usually see ourselves as only sometimes changing our minds. That becomes the new happy medium we all think we’ve achieved, simply because we’re too ignorant of what it actually means to change our beliefs the right amount.
I’m having a hard time seeing how I could decide whether I’m changing my beliefs the right amount; since that would be a (very rough) estimate of an indirect indicator, I’m skeptical of this idea’s potential.
I’m not sure about this.
I agree with many of your points, though your test methodology is… well, impractical.
I think rationality itself is one of the ideological blinders you speak of. Forget blinding; it can be totally debilitating.
Irrational morons can be quite successful by any of the usual measures: procreation, monetary wealth, even happiness.
Rationality is simply a point of view. It is satisfying and maybe even fun. But it’s not God. It’s not the “one true way.”
The world would be an awful place to live if everyone were “rational.”