A relationship between two rationalists can be much happier and freer of drama. If Eliezer’s example isn’t clear enough, here’s another one.
“I’m worried about X.”
Non-rationalist: “I’ve told you a million times, that’s not gonna happen! Why can’t you trust me?”
Rationalist: “Ok, let’s go to Wikipedia, get some stats, and do the expected value calculation. Let me show you how unlikely this is.”
Which conversation ends in a fight? Which conversation ends in both people actually feeling more at ease?
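To make the rationalist's reply concrete, here is a minimal sketch of the kind of back-of-envelope calculation being proposed, in Python. Every figure in it is a made-up placeholder; the real numbers would come from whatever source the two of you agree to check.

```python
# Back-of-envelope sketch of the "get some stats, do the expected value
# calculation" step. Every number here is a made-up placeholder -- swap in
# whatever the source you both agree on (Wikipedia, an actuarial table) says.

p_event_per_exposure = 1e-7   # hypothetical: chance of the feared event per exposure
exposures_per_year = 6        # hypothetical: how many times per year you're exposed
cost_if_it_happens = 1.0      # normalize the worst case to 1 unit of loss

# Chance of at least one occurrence in a year, then the expected loss.
p_per_year = 1 - (1 - p_event_per_exposure) ** exposures_per_year
expected_loss_per_year = p_per_year * cost_if_it_happens

print(f"Chance per year: roughly 1 in {round(1 / p_per_year):,}")
print(f"Expected loss per year: {expected_loss_per_year:.2e} units")
```

The particular figures matter less than the fact that both people agree on the procedure before either of them sees the answer.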
There are female memes to the effect “Men are endearing fools,” and male memes to the effect “Women are beautiful fools.” But a fool eventually gets frustrating. It is an incredible relief to meet someone who isn’t foolish. “Whoa… you mean you can embrace an idea without being an uncritical fanatic? You mean you can actually make allowances for overconfidence bias, instead of taking reckless gambles? You can listen to the content of what I’m saying instead of the applause lights?” Having a rationalist partner means never having to say “Oh, you wouldn’t understand.”
Also, on cultishness: I saw an ad the other day for a new book on how to start a green activist organization. How to attract members, get speaking engagements, raise money, build momentum, etc. My first reaction was “Oh, that’s nice; I’m sure that book would be handy for environmentalists.” Then I thought “If we did half the stuff that tree-hugging college kids do, we’d call it Dark Arts and we’d be terrified of turning into a cult.”
Just curious: what would be a concrete example of an X that would provide for a realistic exchange that fits this pattern?
A world-destroying black hole caused by the LHC. Autism through vaccination. Cancer from low-intensity radio waves (i.e. from a cell phone rather than a radar station). A meteorite hitting your house. A plane crashing into your house even though you don't live under an airport's approach path. Terrorists hijacking the plane you are on even though you rarely fly.
Which of these is a major stressor on romantic relationships?
Not that it’s happened to me, but I can easily see “autism through vaccination” fitting into the scenario.
How much we should have worried about a world-destroying black hole from the LHC sounds hard to determine from Wikipedia stats. Would you just look at "how often do people say that scientists are going to destroy the world, and how often are they right"?
Only the other day, a friend called because she was worried about a possible bad consequence of a mistake she’d made. I immediately agreed that the bad consequence could follow from the mistake. But I went on to point out that it could only happen if three conditions are met, and all three are unlikely, so the probability of the bad consequence is very low.
The result was that she was genuinely reassured. If I had just tried to say “Oh, don’t worry, I’m sure it will be fine”, or tried to argue that it was impossible that it would go wrong, she would have seen that it was not impossible and rejected my reassurance.
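For concreteness, here is the shape of that "three unlikely conditions" argument in a few lines of Python; the individual probabilities below are invented, and rough independence between the conditions is assumed.

```python
# Shape of the "three unlikely conditions" argument, with made-up numbers.
# Assumes the conditions are roughly independent; if they are strongly
# correlated, multiplying like this understates the true risk.

p_condition_a = 0.10
p_condition_b = 0.05
p_condition_c = 0.10

p_all_three = p_condition_a * p_condition_b * p_condition_c
print(f"Probability all three hold: {p_all_three:.4f} (about 1 in {round(1 / p_all_three):,})")
```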
I’m trying to turn her onto this site; at the moment she’s pretty explicitly saying she isn’t sure she wouldn’t prefer to hang on to her illusions.
"Which conversation ends in a fight? Which conversation ends in both people actually feeling more at ease?"
They don't sound meaningfully different to me; you're saying the same thing, just less emotively and more casually.
I saw someone recently suggest saying (in a sympathetic tone) “What are you planning to do?”. (Possibly preceded by something like “Yeah, I can understand why you would be”.) I wouldn’t expect good results from it in real life, but I like it anyway (and it might be better than some alternatives).
They're not the same in substance. The first way says "Trust me: I'm upset that you won't take my word as an answer." (And the reaction will be "You want me to just smile and nod to everything you say? What gives you the authority?") The second way says "Ok, let's see if your fears are justified by checking some objective source." (And, ideally, the reaction will be "Oh, ok, I didn't know that. Guess I shouldn't have worried." Of course, that depends on the worried partner being fairly rational too; a less rational person might just perceive a status grab and not notice the new information.)
The second way also takes advantage of psychological commitment and consistency. First, you commit to a procedure for determining whether to worry about X, like getting stats from Wikipedia and doing some arithmetic. Only then do you actually do this and find out what the answer is—and by then, no matter what the result, you’ve already made the decision to accept it!
Definitely a handy technique.
If both participants are rational, the second approach lets the worried party get real data and execute an update, so a genuine emotional worry can actually go away. That means less anxiety about the relationship, and it's what makes relationships with rationalists orders of magnitude better than relationships with people who are merely smart and reasonable.
I don’t think I could go back to dating a nonrationalist.