Reading Less Wrong made me unable to enjoy debating politics. Now the average online debate seems like a competition over who is most stupid. When Facebook shows me a news article with more than 100 comments and I read a few of them, I feel dirty.
My recommended first aid would be: think less about the stupidity of other people, and more about your own. (Applying my lesson to myself: why am I clicking the “comments” link when I see there are more than 100 comments? And why am I even browsing Facebook in the first place?) If you are so rational, why aren’t you winning more? Yeah, some things in life depend on the cooperation of others. But some other things don’t—have you already maximized those? Why not? Did you already clean up your room?
And my point here is not that if you focus on improving yourself, miracles are going to happen just because you read the Sequences. It’s just that focusing on improving yourself has a chance to lead to something useful, unlike complaining about the stupidity of others.
Most people simply don’t care about their sanity. It is a fact about your environment; deal with it. To a certain degree, this is about “near” vs. “far” thinking (Robin Hanson writes a lot about it); people usually behave quite reasonably in their everyday lives, and say utterly crazy bullshit about anything abstract or remote. They survive because they do not try to connect these two parts; it is as if they live in two completely different universes at the same time.
When you think about incentives, here is the reason: in the “near” mode you are rewarded or punished by the natural consequences of your actions; in the “far” mode you are rewarded or punished by the social consequences of your statements. Thus it makes sense to act reasonably in your everyday life, and to spout exactly the type of crazy bullshit that gets rewarded in a given social situation. On average. Sometimes following the socially approved action (using homeopathy for an actual illness, or not wearing a face mask during the COVID-19 pandemic) gets you killed. But historically, way more people got killed because they pissed off their neighbors by openly disagreeing with them about something; and it didn’t matter who was actually right.
I kinda see people on a scale, roughly separated into three groups: On one extreme, wannabe rationalists. Those are my tribe. On the other extreme, the actively irrational; the kind that not only believes something crazy, but won’t shut up about it. Those I consider hopeless. But between them, and I think it might be the majority of the population, are people who kinda try to do their best, sometimes impressively, sometimes not very well; who have some bullshit in their heads because their environment put it there, but they are not actively promoting it, they are merely unable to clean it up; and who are able to see and listen. With those, I need to find a safe set of conversation topics, remain there most of the time, and sometimes gently probe the boundaries. There is this “agree to disagree” bullshit, which would be intellectually lazy and kinda offensive toward your fellow rationalists, but is a great peace-keeping tool between different tribes.
I never try to convert people. I explain, sometimes I nudge. If there is no reaction, I stop.
I am bad at predicting stupid people. I mean, I can vaguely predict that they will most likely “do something stupid”, but it is hard to make specific predictions. People are usually driven by emotions: they defend what they like, and attack what they dislike. They like things that make them feel good, and dislike things that make them feel bad (e.g. being told they are wrong about something). But in real-life situations, multiple forces act upon them at the same time, and I can’t predict which effect will prevail.
My recommended first aid would be: think less about the stupidity of other people, and more about your own.
This is generally good advice, and I do need to be more mindful of my own stupidity, but my problem isn’t that I go searching for other people’s stupidity so I can get angry at them; it’s more that I’m getting more and more annoyed every time I accidentally bump into it, and I’m trying to avoid reacting by shutting off everything and everyone. Though some of the advice I’m receiving looks helpful for not doing that.
… people usually behave quite reasonably in their everyday lives, and say utterly crazy bullshit about anything abstract or remote.
But historically, way more people got killed because they pissed off their neighbors by openly disagreeing with them about something; and it didn’t matter who was actually right.
I guess that could explain the lack of critical sense they show about stuff they aren’t experts on. I’ve never cared about simply agreeing with other people’s ideas if they didn’t seem right to me at first sight, and I usually thought I was the one who knew best (even when deeply wrong about it), so that’s not a factor my brain considers when trying to simulate other people. Thank you for this useful insight.
There is this “agree to disagree” bullshit, which would be intellectually lazy and kinda offensive toward your fellow rationalists, but is a great peace-keeping tool between different tribes.
I hadn’t thought of it that way. I was refusing to “agree to disagree” as if it were a moral rule, but I should stick with it if I see no chance of actually persuading someone. To be more precise, I had figured out that with non-rationalists it was often better to agree to disagree, since arguing would be a lost cause, but I thought I just couldn’t do that, no matter who I was talking to.
I’m still a bit queasy about apparently supporting bad epistemology, so I think I’ll try to phrase it as: “We can’t both be right, but I guess talking about it won’t lead us anywhere, so let’s just forget about it”.
“We can’t both be right, but I guess talking about it won’t lead us anywhere, so let’s just forget about it”
Yep. Let’s not fight about it.
I would say that even among rationalists, it may sometimes be useful to settle for: “logically, at least one of us must be wrong… but finding out which one would probably be too costly, and this topic is not that important”.
Ironically, I understood the “too costly” logic between rationalists pretty fast, since I’ve witnessed arguments get dissolved, or hit an objectively hard barrier, really fast.
When I’m dealing with non-rationalists, instead, I kinda have the impression that agreement is just around the corner.
“I understood your point of view, and I would have changed mine if I had been making a mistake. If we are still talking, it means I figured out what mistake you are making. Why can’t you just understand what I’m saying, or tell me which part you aren’t understanding? I’m doing my best to explain, and I’ve been honest with you...”
That’s the sensation I usually feel when I care enough to argue about something and don’t just write the effort off as hopeless from the start. But that’s all it is, a feeling; it’s clearly not easy at all to suddenly stop doing what I specifically trained myself to do.
To a certain degree, this is about “near” vs. “far” thinking (Robin Hanson writes a lot about it); people usually behave quite reasonably in their everyday lives, and say utterly crazy bullshit about anything abstract or remote.
When you think about incentives, here is the reason: in the “near” mode you are rewarded or punished by the natural consequences of your actions; in the “far” mode you are rewarded or punished by the social consequences of your statements.
This is very good.