Yes there is. I gave examples that were salient to me, which I had a lot of knowledge about.
And my audience was LessWrong, which I thought could handle the examples like mature adults.
But my main takeaway was flak from people telling me that an evidence repository is unnecessary because “true claims sound better” and, more popularly, that my ideas are “suspicious”―not with any allegation that I said anything untrue*, or that my plan wouldn’t work, or that the evidence I supplied was insufficient or unpersuasive, or that I violated any rationalist virtue whatsoever, but simply because the evidence was “political”.
If you know of some non-political examples which have had as much impact on the modern world as the epistemic errors involved in global warming policy and the invasion of Ukraine, by all means tell me. I beg you. And if not, how did you expect me to make the point that untrue beliefs have large negative global impacts? But never mind; I’m certain you gave no thought to the matter. It feels like you’re just here to tear people down day after day, month after month, year after year. Does it make you feel good? What drives you?
Edit: admittedly that’s not very plausible as a motive, but here’s something that fits better. Knowing about biases can hurt people, but there’s no reason this would be limited only to knowledge about biases. You discovered that there’s no need to use rationalist principles for truthseeking; you can use them instead as spitballs to toss at people―and then leave the room before any disagreements are resolved. Your purpose here, then, is target practice. You play LessWrong the way others play PUBG. And there may be many spitballers here, you’re just more prolific.
* except this guy, whom I thank for illustrating my point that existing forums are unsuitable for reasonably arbitrating factual disagreements.
> And my audience was LessWrong, which I thought could handle the examples like mature adults.
Part of rationality is not being in denial of reality, and there are certain realities about what happens when you talk about politics.
Part of what the sequences are about is to care about reality and you prefer to be in denial of it and to ignore the advice the sequences give. Bringing up that you ignored it felt relevant to me.
> I’m certain you gave no thought to the matter. It feels like you’re just here to tear people down day after day, month after month, year after year.
Then you are wrong. Contemporary politics is one source of examples, but, as described in the sequences, a very bad one. There’s history. In the West, we generally study history to learn from it.
> If you know of some non-political examples which have had as much impact on the modern world as the epistemic errors involved in global warming policy and the invasion of Ukraine, by all means tell me.
In more recent history, the decision to invade Iraq was driven by bad epistemics. Talking about it does not trigger people’s tribal senses the same way as talking about contemporary political conflicts.
If you go further back, there are also plenty of things that happened in the 20th century that were driven by bad epistemics.
Lastly, there’s no reason you have to pick the most consequential examples to make your points. You don’t want people to focus on big consequences; you want them to focus on the dynamics of truthseeking.
> there are certain realities about what happens when you talk about politics.
Says the guy who often wades into politics and often takes politically-charged stances on LW/EAF. You seem to be correct, it’s just sad that the topic you are correct about is the LessWrong community.
> Part of what the sequences are about is to care about reality and you prefer to be in denial of it
How charitable of you. I was misinformed: I thought rationalists were (generally) not mind-killed. And like any good rationalist, I’ve updated on this surprising new evidence. (I still think many are not, but navigating such diversity is very challenging.)
> Then you are wrong.
Almost every interaction I’ve ever had with you has been unpleasant. I’ve had plenty of pleasant interactions, so I’m confident about which one of us this is a property of, and you can imagine how much I believe you. Besides which, it’s implausible that you remember your thought processes in each of the hundred-ish comments you’ve made in the last year. For me to be wrong means you recollected the thought process that went into a one-sentence snipe, as in “oh yeah I remember that comment, that’s the one where I did think about what he was trying to communicate and how he could have done better, but I was busy that day and had to leave a one-sentence snipe instead.”
> Talking about it does not trigger people’s tribal senses the same way as talking about contemporary political conflicts.
Odd but true. Good point.
> there are also plenty of things that happened in the 20th century that were driven by bad epistemics
No doubt, and there might even be many that are clear-cut and no longer political for most people. But there are no such events I am knowledgeable about.
> You don’t want people to focus on big consequences
Yes, I do. I want people to sense the big consequences, deeply and viscerally, in order to generate motivation. Still, a more academic reformulation may also be valuable.
> How charitable of you. I was misinformed: I thought rationalists were (generally) not mind-killed.
That’s easily solved by reading Eliezer’s post “Politics is the Mind-Killer” and understanding the advice it gives.
> For me to be wrong means you recollected the thought process that went into a one-sentence snipe
If I had only voiced that position in a comment and nowhere else, that might be true. But that’s not the case: I have criticized people multiple times for not applying “Politics is the Mind-Killer” and for using political examples to make points that aren’t about politics.
> Says the guy who often wades into politics and often takes politically-charged stances on LW/EAF. You seem to be correct, it’s just sad that the topic you are correct about is the LessWrong community.
I talk about politics when I want to make a point about politics. I usually don’t talk about politics when I want to make a point about something else. If you want to make a point about politics, it’s unavoidable to talk about politics.
The advice of “Politics is the Mind-Killer” is that you don’t talk about politics when you want to make a point that isn’t about politics, because political examples make it harder for that point to come through. The post does not advise people not to talk about politics at all, even though people who haven’t read it sometimes use the title as a catchphrase for the position that one shouldn’t talk about politics in general.
It’s a useful heuristic that Eliezer proposed. You wrote a post titled “Let’s make the truth easier to find” and in it followed heuristics that make the truth harder to find. If your actual goal were to make the truth easier to find, then my feedback would be valuable. Of course, if your goal is to signal that you care about the truth and hold certain political positions, then my feedback feels offensive.