Still, I think that if we look at the history of IQ tests, we can take some lessons from there. I mean: imagine that there are no IQ tests yet, and you are supposed to invent the first one. The task would probably seem impossible, and there would be similar objections.
It’s hard to say given that we have the benefit of hindsight, but at least we wouldn’t have to deal with what I believe to be the killer objection—that irrational people would subconsciously cheat if they know they are being tested.
If the first rationality tests are similarly flawed, that will not mean the entire field is doomed; later the tests can be improved and the heavily culture-specific questions removed, getting closer to the abstract essence of rationality.
I agree, but that still doesn’t get you any closer to overcoming the problem I described.
I agree there is a risk that an irrational person might have a good model of what a rational person would do (while it is impossible for a stupid person to predict how a smart person would solve a difficult problem). I can imagine a smart religious fanatic thinking: “What would HJPEV, the disgusting little heathen, do in this situation?” and running a rationality routine in a sandbox. In that case, the best we could achieve would be tests measuring someone’s capacity to think rationally if they choose to.
To my mind that’s not very helpful, because the irrational people I meet have been pretty good at thinking rationally when they choose to. Let me illustrate with a hypothetical: Suppose you meet a person with a fervent belief in X, where X is some ridiculous and irrational claim. Instead of trying to convince them that X is wrong, you offer them a bet whose outcome is closely tied to whether X is true. Generally they will not take the bet. And in general, when you watch them making high- or medium-stakes decisions, they seem to know perfectly well—at some level—that X is not true.
Of course not all beliefs are capable of being tested in this way, but when they can be tested the phenomenon I described seems pretty much universal. The reasonable inference is that irrational people are generally speaking capable of rational thought. I believe this is known as “standby rationality mode.”
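The betting logic here can be made concrete with a toy expected-value calculation. This is only an illustration with invented numbers, not a claim about any particular belief: if someone’s professed credence in X were the credence they actually act on, a lopsided bet on X would be obviously attractive, so refusing it suggests their operative credence is much lower.

```python
# Toy model: what refusing a bet reveals about someone's acted-on credence in X.
# All numbers are made up for illustration.

def expected_value(p_x_true: float, payout_if_true: float, stake: float) -> float:
    """Expected value of accepting a bet that pays out if X is true."""
    return p_x_true * payout_if_true - (1 - p_x_true) * stake

# A believer who asserts X with "99% certainty" is offered generous odds:
# win $100 if X is true, lose $10 if X is false.
for credence in (0.99, 0.5, 0.05):
    ev = expected_value(credence, payout_if_true=100.0, stake=10.0)
    print(f"credence {credence:.2f}: EV of accepting = ${ev:+.2f}")

# credence 0.99: EV = +$98.90 -> a genuine believer should eagerly accept.
# credence 0.05: EV = -$4.50  -> refusal suggests the acted-on credence is low,
#                                whatever is verbally asserted.
```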
I agree with you that people who assert crazy beliefs frequently don’t behave in the crazy ways those beliefs would entail.
This doesn’t necessarily mean they’re engaging in rational thought.
For one thing, the real world is not that binary. If I assert a crazy belief X, but I behave as though X is not true, it doesn’t follow that my behavior is sane… only that it isn’t crazy in the specific way indicated by X. There are lots of ways to be crazy.
More generally, though… for my own part what I find is that most people’s betting/decision making behavior is neither particularly “rational” nor “irrational” in the way I think you’re using these words, but merely conventional.
That is, I find most people behave the way they’ve seen their peers behaving, regardless of what beliefs they have, let alone what beliefs they assert (asserting beliefs is itself a behavior which is frequently conventional). Sometimes that behavior is sane, sometimes it’s crazy, but in neither case does it reflect sanity or insanity as a fundamental attribute.
You might find yvain’s discussion of epistemic learned helplessness enjoyable and interesting.
That may very well be true… I’m not sure what it says about rationality testing. If there is a behavior which is conventional but possibly irrational, it might not be so easy to assess its rationality. And if it’s conventional and clearly irrational, how can you tell if a testee engages in it? Probably you cannot trust self-reporting.
A lot of words are getting tossed around here whose meanings I’m not confident I understand. Can you say what it is you want to test for, here, without using the word “rational” or its synonyms? Or can you describe two hypothetical individuals, one of whom you’d expect to pass such a test and the other you’d expect to fail?
Our hypothetical person believes himself to be very good at not letting his emotions and desires color his judgments. However, his judgments are heavily informed by these things, and he then subconsciously looks for rationalizations to justify them. He is not consciously aware that he does this.
Ideally, he should fail the rationality test.
Conversely, someone who passes the test is someone who correctly believes that his desires and emotions have very little influence over his judgments.
Does that make sense?
And by the way, one of the desires of Person #1 is to appear “rational” to himself and others. So it’s likely he will subconsciously attempt to cheat on any “rationality test.”
Yeah, that helps.

If I were constructing a test to distinguish person #1 from person #2, I would probably ask them to judge a series of scenarios constructed so that the scenarios were formally identical, but each one had different particulars relating to common emotions and desires, with each scenario presented in isolation (e.g., via a computer display) so that it’s hard to go back and forth and compare.
I would expect P2 to give equivalent answers in each scenario, and P1 not to (though they might try).
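To make that design concrete, here is a minimal sketch in Python. The scenario texts and the simple same/different scoring rule are hypothetical placeholders; a real instrument would need many more matched pairs and a subtler metric.

```python
import random

SCENARIO_PAIRS = [
    # Classic framing pair: formally identical outcomes, different emotional surface.
    ("Treatment A saves 200 of 600 patients. Approve it?",
     "Treatment A lets 400 of 600 patients die. Approve it?"),
    # Hypothetical second pair; a real test would need many of these.
    ("This fund has a 10% chance of losing everything. Invest?",
     "This fund has a 90% chance of preserving your capital. Invest?"),
]

def administer(pairs, ask):
    """Present every variant one at a time, in random order, so the subject
    cannot line matched versions up side by side."""
    items = [(i, v, text)
             for i, pair in enumerate(pairs)
             for v, text in enumerate(pair)]
    random.shuffle(items)
    return {(i, v): ask(text) for i, v, text in items}

def consistency_score(pairs, answers):
    """Fraction of formally identical pairs that received the same judgment.
    P2 (judgments uncoloured by framing) should score near 1.0; P1 lower."""
    same = sum(answers[(i, 0)] == answers[(i, 1)] for i in range(len(pairs)))
    return same / len(pairs)

# Hypothetical usage:
# answers = administer(SCENARIO_PAIRS, ask=lambda text: input(text + " [y/n] ") == "y")
# print(consistency_score(SCENARIO_PAIRS, answers))
```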
I doubt that would work, since P1 most likely has a pretty good standby rationality mode which can be subconsciously invoked if necessary.

But can you give an example of two such formally identical scenarios so I can think about it?

It’s a fair question, but I don’t have a good example to give you, and constructing one would take more effort than I feel like putting into it. So, no, sorry.
That said, what you seem to be saying is that P2 is capable of making decisions that aren’t influenced by their emotions and desires (via “standby rationality mode”) but does not in fact do so except when taking rationality tests, whereas P1 is capable of it and also does so in real life.
If I’ve understood that correctly, then I agree that no rationality test can distinguish P1 and P2’s ability to make decisions that aren’t influenced by their emotions and desires.
> It’s a fair question, but I don’t have a good example to give you, and constructing one would take more effort than I feel like putting into it. So, no, sorry.
That’s unfortunate, because this strikes me as a very important issue. Even being able to measure one’s own rationality would be very helpful, let alone that of others.
> That said, what you seem to be saying is that P2 is capable of making decisions that aren’t influenced by their emotions and desires (via “standby rationality mode”) but does not in fact do so except when taking rationality tests, whereas P1 is capable of it and also does so in real life.
I’m not sure I would put it in terms of “making decisions” so much as “making judgments,” but basically yes. Also, P1 does make rational judgments in real life but the level of rationality depends on what is at stake.
> If I’ve understood that correctly, then I agree that no rationality test can distinguish P1 and P2’s ability to make decisions that aren’t influenced by their emotions and desires.
Well, one idea is to look more directly at what is going on in the brain with some kind of imaging technique. Perhaps self-deception or result-oriented reasoning has a telltale signature.
Also, perhaps this kind of irrationality is more cognitively demanding. To illustrate, suppose you are having a Socratic dialogue with someone who holds irrational belief X. Instead of simply laying out your argument, you ask the person whether he agrees with Proposition Y, where Proposition Y seems pretty obvious and indisputable. Our rational person might quickly and easily agree or disagree with Y. Whereas our irrational person needs to think more carefully about Y; decide whether it might undermine his position; and if it does, construct a rationalization for rejecting Y. This difference in thinking might be measured in terms of reaction times.
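As a rough illustration of how the reaction-time idea might be operationalized: the stimuli below and the simple “latency gap” statistic are assumptions for the sketch, not an established protocol.

```python
import time
from statistics import mean

# Hypothetical stimuli; a real test would calibrate these per subject.
NEUTRAL_ITEMS = ["Triangles have three sides.",
                 "Water boils when heated enough."]
THREATENING_ITEMS = ["An obvious Proposition Y that quietly undermines belief X.",
                     "Another such proposition."]

def timed_judgment(prompt, ask):
    """Seconds taken to produce an agree/disagree judgment for one item."""
    start = time.monotonic()
    ask(prompt)                      # subject answers agree/disagree
    return time.monotonic() - start

def latency_gap(ask):
    """Mean extra time spent on belief-threatening items. A consistently
    positive gap would be weak evidence of motivated screening of Y."""
    neutral = mean(timed_judgment(p, ask) for p in NEUTRAL_ITEMS)
    threatening = mean(timed_judgment(p, ask) for p in THREATENING_ITEMS)
    return threatening - neutral
```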
[ha-ha-only-serious](http://www.catb.org/jargon/html/H/ha-ha-only-serious.html)

Rationality is commonly defined as winning. Therefore rationality testing is easy—just check if the subject is a winner or a loser.

Okay, I think there is a decent probability that you are right, but at this moment we need more data, which we will get by trying to create different kinds of rationality tests.
A possible outcome is that we won’t get true rationality tests, but at least something partially useful, e.g. tests selecting the people capable of rational thought, which includes a lot of irrational people, but still not everyone. Which may still appear to be just another form of intelligence test (a sufficiently intelligent irrational person is able to make rational bets, and still believe they have an invisible dragon in the garage).
So… perhaps this is a moment where I should make a bet about my beliefs. Assuming that Stanovich does not give up, and other people will follow him (that is, assuming that enough psychologists will even try to create rationality tests), I’d guess… probability 20% within 5 years, 40% within 10 years, 80% ever (pre-Singularity) that there will be a test which predicts rationality significantly better than an IQ test. Not completely reliably, but sufficiently that you would want your employees to be tested by that test instead of an IQ test, even if you had to pay more for it. (Which doesn’t mean that employers actually will want to use it. Or will be legally allowed to.) And probability 10% within 10 years, 60% ever that a true “rationality test” will be invented, at least for values up to 130 (which many compartmentalizing people will still pass). These numbers are just a wild guess; tomorrow I would probably give different values. I just thought it would be proper to express my beliefs in this format, because it encourages rationality in general.
> Which may still appear to be just another form of intelligence test
Yes, I have a feeling that “capability of rationality” would be highly correlated with IQ.
> Not completely reliably, but sufficiently that you would want your employees to be tested by that test instead of an IQ test
Your mention of employees raises another issue, which is who the test would be aimed at. When we first started discussing the issue, I had an (admittedly vague) idea in my head that the test could be for aspiring rationalists, i.e. that it could be used to bust irrational lesswrong posters who are far less rational than they realize. It’s arguably more of a challenge to come up with a test to smoke out the self-proclaimed paragon of rationality who has the advantage of careful study and who knows exactly what he is being tested for.
By analogy, consider the Crowne-Marlowe Social Desirability Scale, which has been described as a test which measures “the respondent’s desire to exaggerate his own moral excellence and to present a socially desirable facade.” Here is a sample question from the test:
> T F: I have never intensely disliked anyone
Probably the test works pretty well for your typical Joe or Jane Sixpack. But someone who is intelligent; who has studied up in this area; and who knows what’s being tested will surely conceal his desire to exaggerate his moral excellence.
That said, having thought about it, I do think there is a decent chance that solid rationality tests will be developed, at least for subjects who are unprepared. One possibility is to measure reaction times, as with “Project Implicit.” Perhaps self-deception is more cognitively demanding than self-honesty, and therefore a clever test might measure it. But you still might run into the problem of subconscious cheating.
> Perhaps self-deception is more cognitively demanding than self-honesty and therefore a clever test might measure it.
If anything, I might expect the opposite to be true in this context. Neurotypical people have fast and frugal conformity heuristics to fall back on, while self-honesty on a lot of questions would probably take some reflection; at least, that’s true for questions that require aggregating information or assessing personality characteristics rather than coming up with a single example of something.
It’d definitely be interesting to hook someone up to a polygraph or EEG and have them take the Crowne-Marlowe Scale, though.
> If anything, I might expect the opposite to be true in this context.
Well, consider the hypothetical I proposed:
> Suppose you are having a Socratic dialogue with someone who holds irrational belief X. Instead of simply laying out your argument, you ask the person whether he agrees with Proposition Y, where Proposition Y seems pretty obvious and indisputable. Our rational person might quickly and easily agree or disagree with Y. Whereas our irrational person needs to think more carefully about Y; decide whether it might undermine his position; and if it does, construct a rationalization for rejecting Y. This difference in thinking might be measured in terms of reaction times.
See what I mean?
I do agree that in other contexts, self-deception might require less thought, e.g. spouting off the socially preferable answer to a question without really thinking about what the correct answer is.

Yes.
> It’d definitely be interesting to hook someone up to a polygraph or EEG and have them take the Crowne-Marlowe Scale, though.
That sample question reminds me of a “lie score,” which is a hidden part of some personality tests. Among the serious questions there are also some questions like this, where you are almost certain that the “nice” answer is a lie. Most people will lie on one or two of ten such questions, but the rule of thumb is that if they lie on five or more, you just throw the questionnaire away and declare them a cheater. However, if they didn’t lie on any of these questions, you do a background check on whether they have studied psychology, and you keep in mind that the test score may be manipulated. (A toy version of this scoring rule is sketched after this comment.)
Okay, I admit that this problem would be much worse for rationality tests, because if you want a person with a given personality, they most likely didn’t study psychology. But if CFAR or similar organizations become very popular, then many candidates for highly rational people will be “tainted” by the explicit study of rationality, simply because studying rationality explicitly is probably a rational thing to do (this is just an assumption), but it’s also what an irrational person self-identifying as a rationalist would do. Also, practicing for IQ tests is obvious cheating, but practicing to get better at rational tasks is the rational thing to do, and a wannabe rationalist would do it, too.
Well, it seems like rationality tests would be more similar to IQ tests than to personality tests. Puzzles, time limits… maybe even reaction times or lie detectors.
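For what it’s worth, here is a toy version of the lie-score rule of thumb described above. The items and thresholds are illustrative only, taken from the examples in this thread rather than from any published scale.

```python
# Hidden lie items: (statement, answer that is almost certainly a flattering lie).
LIE_ITEMS = [
    ("I have never intensely disliked anyone", True),
    ("I like to gossip at times", False),
    # ...eight more such items in a full ten-item scale
]

def lie_score(responses):
    """Count how many lie items received the too-nice answer.
    `responses` maps statement text to the subject's True/False answer."""
    return sum(responses[stmt] == nice for stmt, nice in LIE_ITEMS)

def verdict(score):
    if score >= 5:
        return "throw the questionnaire away: probable faking"
    if score == 0:
        return "suspiciously clean: check for a psychology background"
    return "normal range: most people lie on one or two"
```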
> Among the serious questions there are also some questions like this, where you are almost certain that the “nice” answer is a lie.
On the Crowne-Marlowe scale, it looks to me (having found a copy online and taken it) like most of the questions are of this form. When I answered all of the questions honestly, I scored 6, which, according to the test, indicates that I am “more willing than most people to respond to tests truthfully”; but what it indicates to me is that, for all but 6 of the 33 questions, the “nice” answer was a lie, at least for me.
The 6 questions were the ones where the answer I gave was, according to the test, the “nice” one, but just happened to be the truth in my case: for example, one of the 6 was “T F I like to gossip at times”; I answered “F”, which is the “nice” answer according to the test—presumably on the assumption that most people do like to gossip but don’t want to admit it—but I genuinely don’t like to gossip at all, and can’t stand talking to people who do. Of course, now you have the problem of deciding whether that statement is true or not. :-)
Could a rationality test be gamed by lying? I think that possibility is inevitable for a test where all you can do is ask the subject questions; you always have the issue of how to know they are answering honestly.
> Well, it seems like rationality tests would be more similar to IQ tests than to personality tests. Puzzles, time limits… maybe even reaction times or lie detectors.
Yes, reaction times seem like an interesting possibility. There is an online test for racism which uses this principle. But it would be pretty easy to beat the test if the results counted for anything. Actually, lie detectors can be beaten too.
Perhaps brain imaging will eventually advance to the point where you can cheaply and accurately determine if someone is engaged in deception or self-deception :)