I agree with you that people who assert crazy beliefs frequently don’t behave in the crazy ways those beliefs would entail.
This doesn’t necessarily mean they’re engaging in rational thought.
For one thing, the real world is not that binary. If I assert a crazy belief X, but I behave as though X is not true, it doesn’t follow that my behavior is sane… only that it isn’t crazy in the specific way indicated by X. There are lots of ways to be crazy.
More generally, though… for my own part what I find is that most people’s betting/decision making behavior is neither particularly “rational” nor “irrational” in the way I think you’re using these words, but merely conventional.
That is, I find most people behave the way they’ve seen their peers behaving, regardless of what beliefs they have, let alone what beliefs they assert (asserting beliefs is itself a behavior which is frequently conventional). Sometimes that behavior is sane, sometimes it’s crazy, but in neither case does it reflect sanity or insanity as a fundamental attribute.
You might find yvain’s discussion of epistemic learned helplessness enjoyable and interesting.
That may very well be true... I'm not sure what it says about rationality testing. If there is a behavior which is conventional but possibly irrational, it might not be so easy to assess its rationality. And if it's conventional and clearly irrational, how can you tell if a testee engages in it? Probably you cannot trust self-reporting.
A lot of words are getting tossed around here whose meanings I’m not confident I understand. Can you say what it is you want to test for, here, without using the word “rational” or its synonyms? Or can you describe two hypothetical individuals, one of whom you’d expect to pass such a test and the other you’d expect to fail?
Our hypothetical person believes himself to be very good at not letting his emotions and desires color his judgments. However, his judgments are heavily informed by these things, and he then subconsciously looks for rationalizations to justify them. He is not consciously aware that he does this.
Ideally, he should fail the rationality test.
Conversely, someone who passes the test is someone who correctly believes that his desires and emotions have very little influence over his judgments.
Does that make sense?
And by the way, one of the desires of Person #1 is to appear “rational” to himself and others. So it’s likely he will subconsciously attempt to cheat on any “rationality test.”
Yeah, that helps.
If I were constructing a test to distinguish person #1 from person #2, I would probably ask them to judge a series of scenarios constructed so that, formally, the scenarios were identical, but each one had different particulars relating to common emotions and desires. Each scenario would be presented in isolation (e.g., via a computer display) so that it’s hard to go back and forth and compare.
I would expect P2 to give equivalent answers in each scenario, and P1 not to (though they might try).
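The scoring idea here can be sketched in a few lines. This is a minimal illustration, not a real test battery: the scenario answers and the 1–7 scale are invented, and `inconsistency_score` is a hypothetical helper name.

```python
# Hypothetical sketch: score a subject's consistency across pairs of
# formally equivalent scenarios. Each pair shares the same formal
# structure (e.g., identical expected values) but differs in
# emotionally loaded particulars. Answers are on an invented 1-7 scale.
scenario_pairs = [
    # (answer to neutral framing, answer to emotionally loaded framing)
    (5, 5),
    (6, 3),
    (4, 4),
]

def inconsistency_score(pairs):
    """Mean absolute gap between answers that should be identical."""
    return sum(abs(a - b) for a, b in pairs) / len(pairs)

score = inconsistency_score(scenario_pairs)
# A perfectly consistent P2 scores 0; larger gaps suggest the framing,
# rather than the formal structure, is driving the judgment.
print(score)
```

On this scoring rule, P2 should land at or near zero while P1's emotionally driven answers pull the score up, though a real test would need many items and controls for noise.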
I doubt that would work, since P1 most likely has a pretty good standby rationality mode which can be subconsciously invoked if necessary.
But can you give an example of two such formally identical scenarios so I can think about it?
It’s a fair question, but I don’t have a good example to give you, and constructing one would take more effort than I feel like putting into it. So, no, sorry.
That said, what you seem to be saying is that P1 is capable of making decisions that aren’t influenced by their emotions and desires (via “standby rationality mode”) but does not in fact do so except when taking rationality tests, whereas P2 is capable of it and also does so in real life.
If I’ve understood that correctly, then I agree that no rationality test can distinguish P1 and P2’s ability to make decisions that aren’t influenced by their emotions and desires.
That’s unfortunate, because this strikes me as a very important issue. Even being able to measure one’s own rationality would be very helpful, let alone that of others.
I’m not sure I would put it in terms of “making decisions” so much as “making judgments,” but basically yes. Also, P1 does make rational judgments in real life, but the level of rationality depends on what is at stake.
Well, one idea is to look more directly at what is going on in the brain with some kind of imaging technique. Perhaps self-deception or result-oriented reasoning has a telltale signature.
Also, perhaps this kind of irrationality is more cognitively demanding. To illustrate, suppose you are having a Socratic dialogue with someone who holds irrational belief X. Instead of simply laying out your argument, you ask the person whether he agrees with Proposition Y, where Proposition Y seems pretty obvious and indisputable. Our rational person might quickly and easily agree or disagree with Y. Whereas our irrational person needs to think more carefully about Y; decide whether it might undermine his position; and if it does, construct a rationalization for rejecting Y. This difference in thinking might be measured in terms of reaction times.
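The reaction-time idea above can be sketched as a simple comparison of mean latencies. Everything here is illustrative: the timing numbers are invented, and a real protocol would need many items matched for difficulty.

```python
# Illustrative sketch of the reaction-time tell: if rationalizing a
# belief-threatening proposition takes extra processing, latencies on
# threatening items should run longer than on neutral ones.
# All numbers below are invented for illustration.
neutral_rts = [0.62, 0.58, 0.71, 0.65]      # seconds, easy/neutral items
threatening_rts = [1.10, 0.95, 1.30, 1.05]  # items that threaten belief X

def mean(xs):
    return sum(xs) / len(xs)

latency_gap = mean(threatening_rts) - mean(neutral_rts)
# A consistently positive gap across many matched items would be the
# hypothesized signature of motivated reasoning; a zero gap is what
# we'd expect from the rational subject.
print(round(latency_gap, 3))
```

Of course, slower answers can also reflect mere item difficulty or caution, which is why the scenarios would have to be formally matched, as proposed earlier in the thread.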
[ha-ha-only-serious](http://www.catb.org/jargon/html/H/ha-ha-only-serious.html)
Rationality is commonly defined as winning. Therefore rationality testing is easy—just check if the subject is a winner or a loser.