While there may be a substantial worldview gap, I suspect the much larger difference is that most Sneer Clubbers are looking to boost their status by bullying anyone who looks like a vulnerable target, and merely being different, as LessWrong is, is enough to qualify. This situation is best modeled by conflict theory, not mistake theory.
Since that doesn't seem likely to be the sort of answer you're looking for, though: if I wanted to bridge the inferential gap with a hypothetical Sneer Clubber who genuinely cared about truth, or indeed about anything other than status (which they do not), I'd tell them that convention doesn't work as well as one might think. If you think the conventional way of approaching the world is usually right, the rationalist community will seem unusually stupid. We ignore all this free wisdom lying around and try to reinvent the wheel! If the conventional wisdom is correct, then concerns about the world changing, whether due to AI or anything else, are pointless; if they were important, conventional wisdom would already be talking about them. If the conventional wisdom is correct, Bayesianism is potentially wrong (it's not part of the Standard Approach to Life), and certainly useless: why learn through probability theory when tradition can tell you everything you need to know much faster? But I would tell them that in a world where the conventional wisdom was embarrassingly wrong in every previous era, it would be a remarkable coincidence for this age to be the first to get everything right. And if tradition isn't perfect, or nearly so, that's when rationalism suddenly becomes very important.
I would also tell them that it's possible to actually understand things. Most people seem to go through life by rote, not recognizing when something doesn't make sense because they don't expect anything to make sense. But it's possible to start thinking through how things work, and when you do, rationality starts seeming sensible, because you can see both how it works and that it works, rather than silly because it superficially pattern-matches to a Scientology-style cult.
Thank you for an answer that addresses the prompt!
Of course. While I believe dialogue with them to be unproductive, it's only polite to actually answer your question too. Doubly so if I'd turned out to be wrong about them shutting down discussion; but as we've seen, that didn't take long.
Your outgroup is not homogeneous; it just seems that way.
Be careful. Politics is the mind killer in more ways than one. It's all too easy to mindlessly hate an enemy. But it's also possible to overcorrect and assume that everyone must have a point, or at least be plausibly correct in their own eyes. I am reminded of how Scott was genuinely surprised when the NY Times did a hatchet job on him: he specializes in charitable interpretations, and can often write fascinating pieces about how something that initially seemed absurd looks reasonable from a certain point of view. That's a great skill to have, but he seems to have overlooked the fact that true malice also exists, even if it's tempting to attribute it too often.
I predict that these people can be accurately modeled as status maximizers coordinating around a leftist narrative, who do not actually anticipate as if that narrative were correct, and who have no significant values beyond status-seeking and perhaps a little generalized self-interest. If you disagree, it makes more sense to let observation settle this than to simply note that I am making a generalized claim about an outgroup. I could be wrong, of course, but my model of them has proven quite accurate so far. Certainly it seems closer to the truth than the model of someone who thought a productive discussion with them was possible: already we have seen a mod shut down the discussion from their end and come here to contribute nothing but, y'know, sneering.
Or to put it more succinctly: it should not be surprising that a website for bullying is full of bullies.
That's a special case of "tribalism is the mind killer."
Whereas your ingroup must be something different, because ingroups and outgroups never have anything in common.
Ditto.