It seems like a few sequence posts touch on this (Guardians of Ayn Rand, Guardians of the Truth, and other pieces of the Craft and the Community sequence). I’m not sure if they seemed irrelevant to the question you meant to be orienting around, or you were looking for newer things, or just forgot about them.
I guess by ideology I mean a set of ideas or beliefs used to rally a social movement, which tend to become unquestionable “truths” once the movement succeeds in gaining power. So, for example: theism, Communism, the Aryan “master race”. The “Guardian” posts you cite do seem somewhat relevant, but they don’t really address the main questions I have, which I list below. (Also, I didn’t find them on my own because I was searching for “ideology” as the keyword.)
Eliezer’s posts don’t seem to address the “rallying flag” function of ideology. Given that ideologies are useful as rallying flags for people to coordinate / build alliances around but can also become increasingly harmful as they become more embedded (into e.g. education and policy) and unquestionable, what should someone trying to build a social movement do?
What to do if one observes some harmful ideology growing in influence? If you try to argue against it, you become an enemy of the movement and might suffer serious personal consequences. If you try to build a counter-movement, you probably end up creating a movement with its own ideology, which might be no less harmful.
What to do if the harmful ideology has already taken over a whole society?
Given that ideologies are useful as rallying flags for people to coordinate / build alliances around but can also become increasingly harmful as they become more embedded (into e.g. education and policy) and unquestionable, what should someone trying to build a social movement do?
One idea is to have some sort of timed auto-destruct mechanism for the ideology. For example, have the founders and other high-status members of the movement record a video asking people to question the ideology, and giving a bunch of reasons why the ideology might be false or why people shouldn’t be so certain about it, to be released after the movement succeeds in gaining power. People concerned about ideologies could try to privately talk the leaders into doing this. But with deepfakes becoming possible, this might not work so well in the future (and the timing mechanism also seems tricky to get right), so I wonder what else can be done.
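As an aside on the deepfake worry specifically: a standard cryptographic hash commitment could at least make the recording’s provenance verifiable, since a digest published when the movement is founded proves the video existed back then and wasn’t fabricated later. Here is a minimal Python sketch of that idea (the function names and the stand-in recording are hypothetical, purely for illustration; this addresses authenticity only, not the separate problem of triggering the release at the right time):

```python
# Minimal sketch of a hash commitment for the "auto-destruct" recording:
# publish the digest at founding time, release the video later, and let
# anyone verify that the released file matches the old commitment.
import hashlib

def commit(video_bytes: bytes) -> str:
    """Digest to publish publicly when the movement is founded."""
    return hashlib.sha256(video_bytes).hexdigest()

def verify(video_bytes: bytes, published_digest: str) -> bool:
    """Check that a later-released video matches the published digest."""
    return hashlib.sha256(video_bytes).hexdigest() == published_digest

# Hypothetical usage with a stand-in for the actual video file:
recording = b"founders' recorded statement questioning the ideology"
digest = commit(recording)        # published at founding time
assert verify(recording, digest)  # checked by anyone at release time
```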
My guess is that there are fragments of things addressing at least part of this, just not oriented around ideology as a keyword (Belief as Attire, Professing and Cheering, A Fable of Science and Politics). I guess one thing is that much of the sequences is focused on “here is a way for beliefs to be wrong” rather than examining more closely why having this way-of-treating-beliefs might be useful. (Although I think Robin Hanson’s work often explores that more directly.)
What to do if you spot a harmful ideology is a political question, and in some cases the answer might be pretty orthogonal to rationality. (Although you might mean the more specific subquestion of “how to stop harmful ideologies while maintaining/raising the sanity waterline”, since many people fight harmful ideologies with counter-ideologies.)
Some random additional thoughts (this might also be part of what you were already thinking of; it’s just what my brain had easily available):
I think I see the word ideology as a bit more neutral than you’re using it here. Or at least, your examples are ones generally accepted around here as false/bad. But LessWrong has an overall ideology of beliefs-that-we-coordinate-around, complete with “those beliefs being object-level useful” and “some people using those beliefs as attire, sometimes for reasons that are plausibly virtuous and sometimes for reasons that seem like exactly the sort of thing Eliezer wrote the sequences to complain about”.
Science also has an ideology (similar to but different from Yudkowskianism). The sequences also cover “how to address wrongness in the science ideology”, I think. For example, in Science or Bayes:
In physics, you can get absolutely clear-cut issues. Not in the sense that the issues are trivial to explain. But if you try to apply Bayes to healthcare, or economics, you may not be able to formally lay out what is the simplest hypothesis, or what the evidence supports. But when I say “macroscopic decoherence is simpler than collapse” it is actually strict simplicity; you could write the two hypotheses out as computer programs and count the lines of code. Nor is the evidence itself in dispute.
I wanted a very clear example—Bayes says “zig”, this is a zag—when it came time to break your allegiance to Science. [emphasis mine]
“Oh, sure,” you say, “the physicists messed up the many-worlds thing, but give them a break, Eliezer! No one ever claimed that the social process of science was perfect. People are human; they make mistakes.”
But the physicists who refuse to adopt many-worlds aren’t disobeying the rules of Science. They’re obeying the rules of Science.
The tradition handed down through the generations says that a new physics theory comes up with new experimental predictions that distinguish it from the old theory. You perform the test, and the new theory is confirmed or falsified. If it’s confirmed, you hold a huge celebration, call the newspapers, and hand out Nobel Prizes for everyone; any doddering old emeritus professors who refuse to convert are quietly humored. If the theory is disconfirmed, the lead proponent publicly recants, and gains a reputation for honesty.
(Paul Graham’s “What You Can’t Say” is also relevant.)

So, one way to fight bad/wrong/incomplete ideology is… well, to argue against it, if you’re in an environment where that sort of thing works. If you’re not in an environment conducive to clear argument, the obvious choices are “first try to make the environment conducive to argument” or, well, various dark-artsy rhetorical flourishes that work symmetrically whether your ideas are good or not.
It seems like you have more specific questions in mind (would be curious what your motivating examples are).
The way I’d have carved up your question space is less like “how to stop/fight ideologies” and more like “what to do about the general fact of some sets of beliefs becoming sticky over time?”
The sequences also touch on the claim “Death is good because it kills old scientists that are stuck in their ways, which allows science to march forward”, to which Eliezer replies “Jesus Christ, sure, but you can just make scientists retire without killing them.” But you do still need to implement the part where you actually make them retire as public figures.
What to do if you spot a harmful ideology is a political question, and in some cases the answer might be pretty orthogonal to rationality. (Although you might mean the more specific subquestion of “how to stop harmful ideologies while maintaining/raising the sanity waterline”, since many people fight harmful ideologies with counter-ideologies.)
Right, politics as usual seems to imply a sequence of ideologies replacing each other, and it might just be a random walk as far as how beneficial/harmful the ideologies are. My question is how to do better than that.
It seems like you have more specific questions in mind (would be curious what your motivating examples are).
My original motivating examples came from contemporary US politics, so it’s probably better not to bring them up here, but I’m now also worried about the implications for the “long reflection” / “great deliberation”.
first try to make the environment conducive to argument
By doing what? I mean, it seems possible to build environments conducive to argument for a relatively small group of people, like LW, but I don’t know what can be done to push a whole society in that direction, so that’s part of my question.
The way I’d have carved up your question space is less like “how to stop/fight ideologies” and more like “what to do about the general fact of some sets of beliefs becoming sticky over time?”
I think I’m still more inclined to use the first framing, because if we make beliefs less sticky, it might just speed up the cycles of ideologies replacing each other, and it seems like the bigger problem is “beliefs as rallying flags” (i.e., beliefs can be selected for because they are good rallying flags rather than for epistemic reasons).
(btw, I think this comment would work well as a question, which might make it easier to reference in the future)

I’d have no problem with turning it into a top-level question post, if that’s something you can do. (I posted it in Open Thread in case there was already some sequence of posts that directly addressed my questions, which I simply missed.) If not, I may write a question post after I do some more research and think/talk things over.
Might be useful to taboo “ideology”.