Maybe being rational in social situations is the same kind of faux pas as remaining sober at a drinking party.
It occurred to me yesterday that maybe typical human irrationality is some kind of self-handicapping process which could still be a game-theoretically winning move in some situations… and that perhaps many rational people (certainly including me) lack the social skill to recognize it and act optimally.
The idea came to me when thinking about some smart-but-irrational people who make big money selling products to the irrational people around them. (The products are supposed to make one healthy, there is zero research about them, you can only buy them from an MLM pyramid, and they seem to be rather overpriced.) To me, “making big money” is part of “winning”, but I realize I could not make money this way, simply because I couldn’t find enough irrational people in my social sphere, because I prefer to avoid irrational people. Also, I would consider it immoral to make money by selling nonsense to my friends. But moral arguments aside, let’s suppose that I started gathering an alternative social circle among irrational people, just to make them my customers for the overpriced irrational products. Just for the sake of the experiment. To befriend them to the point where they would trust me about alternative medicine or similar stuff, I would have to convince them that I strongly believe in that stuff, and that I actually study it more deeply than they do (which is why they should buy the products from me, instead of using their own reasoning). In other words, I would have to develop an irrational identity. To avoid detection, I should believe in the irrational things… but not too much, to avoid ruining my own life. (My goal is to believe just enough to promote and sell those products convincingly, not to start buying them for myself.) Belief in belief and compartmentalization are useful tools for this purpose, although they can easily get out of hand… which is why I compare it with alcohol drinking.
With alcohol drinking, there is a risk that you will get drunk and do something really stupid, but if you resist the worst effects, you get some social benefits. So it is like a costly competition, where the people who get drunk but resist the worst effects of alcohol are the winners. Those who refuse to drink are cheaters, and they don’t get the social benefits of winning.
Analogously, irrationality may be a similar competition in self-handicapping—the goal is to be irrational in the right way, while the losers are irrational in the wrong way. You are a winner if you sell horoscopes, UFO movies, alternative medicine, or overpriced MLM products, or simply impress people and get some social benefits. You are a loser if you buy horoscopes, UFO movies, alternative medicine, or MLM products, or if you worship your irrational gurus. The goal is to believe, but not too much or in the wrong way. If you are rational, you are a cheater; you don’t win.
In both situations, the game is a net loss for society. Society as a whole would be better off without irrationality, just as it would be better off without alcoholism. But despite the total losses, there are individual wins… and they keep the game running. I am not sure exactly how big the wins of being the most resistant alcoholic are, but obviously big enough to keep the game going. The wins of being a successful irrationalist seem a thousand times greater, so I don’t expect this game to go away either.
Can you clarify what you mean by this? (My guess is that you’re indulging in some nearsighted consequentialism here.)
With alcohol drinking, there is a risk that you will get drunk and do something really stupid
Doing stupid things while drunk can be fun. You can get good stories out of it, and it can promote bonding (e.g. in the typical stereotype of a college fraternity). Danger can be exciting, and getting really drunk is the easiest way for young people in otherwise comfortable situations to get it.
Edit: I’m uncomfortable with the way you’re tossing around the word “irrational” in this comment. Rationality is about winning. Are the people you’re calling irrational systematically failing to win, or are they just using a different definition of winning than you are? Are you using “rationality” to refer to winning, or are you using it to refer to a collection of cached thoughts / applause lights / tribal signals? (This is directed particularly at “smart-but-irrational people who make big money selling some products to irrational people around them...”)
Are the people you’re calling irrational systematically failing to win, or are they just using a different definition of winning than you are?
Actually, I am not sure. Or more precisely, I am not sure about the proper reference class, and its choice influences the result. As an example, imagine people who believe in homeopathy. Some of them (a minority) are selling homeopathic cures; some of them (a majority) are buying them. Let’s suppose that the only way to be a successful homeopathic seller is to believe that homeopathy works. Do these successful sellers “win” or not? By “winning” let’s assume only real-world success (money, popularity, etc.), not whether LessWrong would approve of their epistemology.
If the reference class is “people who are rich from selling homeopathy”, then yes, they are winning. But this is not a class one can simply join, just as one cannot join the class of “people who won the lottery” without joining the “people who bought lottery tickets” and hoping for a lucky outcome. If we assume that successful homeopathic sellers believe in their art, they must first join the “people who believe in homeopathy” group—which I suppose is not winning—and then the lucky ones end up as sellers, and most of them end up as customers.
So my situation is something like feeling envious upon seeing that someone won the lottery, yet not wanting to buy a lottery ticket. (And speculating whether lottery tickets with the winning numbers could be successfully forged, or how else the lottery could be gamed.)
But the main idea here is that irrational people participate in games that rational people are forbidden from participating in. A social mechanism makes sure that those who don’t buy lottery tickets don’t win. You are allowed to sell miracles only if you convince others that you would also buy miracles if you were in a different situation.
And maybe the social mechanism is so strong that participating in the miracle business actually is winning. Not because the miracles work, but because the penalties of being excluded can be even greater than the average losses from believing in the miracles. An extreme example: it is better to lose every Sunday morning and 10% of your income to the church than to be burned as a heretic. A less extreme example: it is better to have many friends who enjoy homeopathy and crystal healing and whatever than to have true beliefs and fewer friends. It is difficult to evaluate, because I can’t estimate well either the average costs of believing in homeopathy or the average costs of social isolation from not believing. Both of them are probably rather low.
Also, I think that irrational people actually have a good reason to dislike rational people. It’s a form of self-defence. If irrational people had no prejudice against rational people, the rational people could exploit them. Even though I don’t believe in homeopathy, if I saw a willing market, I could be tempted to sell.
If the reference class is “people who are rich by selling homeopathy”, then yes, they are winning. But this is not a class one can join
Why not?
So my situation is something like feeling envy on seeing that someone won the lottery, and yet not wanting to buy a lottery ticket.
You don’t have to get particularly lucky to be around a lot of gullible people.
But the main idea here is that irrational people participate in games that rational people are forbidden from participating in.
Forbidden by what? Again, are you using “rationality” to refer to winning, or are you using it to refer to a collection of cached thoughts / applause lights / tribal signals?

May I suggest, as an exercise, that you taboo both “rational” and “irrational” for a bit?
(For what it’s worth, I’m not suggesting that you start selling homeopathic medicine. Even if I thought this was a good way to get rich I wouldn’t do it because I think selling people medicine that doesn’t cure them hurts them, not because it would make me low-status in the rationalist tribe.)
I am using “irrational” as in: believing in fairies, horoscopes, crystal healing, homeopathy, etc. Epistemically wrong beliefs, whether believing in them is profitable or not. (It seems to me that many of those beliefs correlate positively with each other: if a person already reads horoscopes, they are more likely to also believe in crystal healing, etc., which is why I put them in the same category.)
Whether believing in them is profitable, and how much of that profit can be captured by a person who does not believe—well, that’s part of what I am asking. I suspect that selling this kind of product is much easier for a person who believes. (If you talk with a person who sells these products and with a person who buys them, both will express similar beliefs: beliefs that products of this kind do work.) Thus, although believing in these products is epistemically wrong (i.e. they don’t really work as advertised) and is a net loss for the average believer, some people may get big profits from it, and some actually do.
I suspect that believing is necessary for selling. Which is kind of suspicious. Think about this: Would you buy gold (at a favourable price) from a person who believes that gold is worthless? Would you buy homeopathic treatment (at a favourable price) from a person who believes that homeopathy does not work? (Let’s assume that the unbeliever is not a manufacturer, only a distributor.) I suspect that even for a person wholeheartedly believing in homeopathy, the answer is “yes” to the former and “no” to the latter. That is, expressing belief is necessary for selling; and the expression is more convincing if the person really believes.
Thus I suspect there is some optimal degree of belief, or a right kind of compartmentalization, which leads a person to profess belief in the products and profit from selling them, and yet does not lead them to (excessive) buying of the products. (For example, if I believed that a magical potion increases my intelligence, I would drink one, convince myself and all my friends of its usefulness, sell them a hundred potions in an MLM system, and make a profit. But if I really, really believed that the magical potion increases my intelligence, I would rather buy hundreds for myself. Which would be a loss, because in reality the magical potion is just ordinary water with good marketing.)
This level of epistemic wrongness is profitable, but you cannot simply put yourself there. You cannot just make yourself believe something. And if you had a magic wand and could make yourself believe it, you would risk becoming a customer, not a dealer.
I suspect that on some level, people with some epistemically wrong beliefs actually know that they are wrong. They can talk all day about how the world will end in December 2012, but they don’t sell their houses and enjoy the money while they can. Perhaps with horoscopes and homeopathy, where the stakes are lower, they use a heuristic: only buy from a person who believes the same thing as you. Thus if you are wrong and the other person knows it, you are an exploitable fool. But if you both believe in a product, and it does not really work, then it was just an honest mistake made in good faith.
I suspect that on some level, people with some epistemically wrong beliefs actually know that they are wrong. They can talk all day about how the world will end in December 2012, but they don’t sell their houses and enjoy the money while they can.
See belief as cheering:

As she recited her tale of the primordial cow, with that same strange flaunting pride, she wasn’t even trying to be persuasive—wasn’t even trying to convince us that she took her own religion seriously. [...] It finally occurred to me that this woman wasn’t trying to convince us or even convince herself. Her recitation of the creation story wasn’t about the creation of the world at all. Rather, by launching into a five-minute diatribe about the primordial cow, she was cheering for paganism, like holding up a banner at a football game.
The folks who talked about the world ending in December 2012 weren’t really predicting something, in the way they would say “I believe that loose tire is going to fall off that truck” or “I expect if you make a habit of eating raw cookie dough with eggs in it, you’ll get salmonellosis.” They were expressing affiliation with other people who talk about the world ending in December 2012. They were putting up a banner that says “Hooray for cultural appropriation!” or some such.
I remain sober at alcohol-filled parties all the time and do fine.