I’m thinking the very hardest bit is going to be getting across to people that it can happen to you. [...] The predatory memes have evolved to eat people who think it can’t happen to them.
Certainly there are people who can’t be infected with strong cultish memes, and when those people believe it can’t happen to them, they are correct. There are also people who hold that belief incorrectly, but their existence is not a strong argument that the belief is impossible to hold correctly. You seem to be overstating the case, implying undue confidence.
Yes, I seem to be stating the probability as 1 rather than as a high percentage. That’s hyperbole, sorry. What I mean to get across is that it’s higher than most people think, particularly people who consider that they think better than others. Thinking better than most people isn’t actually that hard, and with sufficient LessWrong you may think quite a lot better. You still have all your cognitive biases—they’re in the buggy, corrupt hardware. Knowing about them doesn’t grant you immunity to them.
WrongBot gives an anecdote of just how wrong a brain can be. You Are Not So Smart’s about page gives a summary of the problem, and the blog itself gives the examples. I try to notice my own stupidities and I still miss a ton (my loved ones are happy to point them out). In general, people don’t have a keen sense for their own stupidities, and learning how to be rational can induce a hubris in which one thinks one isn’t susceptible any more. (What is the correct term for this bias?)
I do think it likely that any mind will have susceptibilities and exploits. Consider the AI box experiment. Even a human can think of an argument that convinces another human to do the thing they really, really shouldn’t, when the subject knows the game and knows that the game is on; what could a human or an evolved meme do when the subject isn’t aware the game is on, or that there’s a game at all?
You still have all your cognitive biases—they’re in the buggy, corrupt hardware. Knowing about them doesn’t grant you immunity to them.
What are you arguing for with these arguments? Being protected from cults doesn’t require a lack of bias, and indeed a lack of bias is an unattainable idealization.
If you argue that the presence of biases knowably confers overconfidence in the belief “I can’t be captured by a cult”, then correcting for that knowable bias leaves you no longer knowably biased. Since this can be said about any belief, it’s not clear why it should be said about this particular one, unless you believe that this belief is more systematically incorrect than others. But then you need to argue about what distinguishes this belief from the others, not about the presence of bias in general. That people are not perfectly rational is not a general argument against any particular belief.
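To make that step concrete (a small formalization I am sketching here; the symbols p, p̂, and b are my own notation and do not appear in the thread): if the overconfidence contributed by bias is knowable, it can simply be subtracted out, and whatever bias remains is unknown by construction, which is equally true of every belief.

```latex
% Illustrative sketch of the debiasing step; notation is mine, not the thread's.
% \hat{p} : the stated credence in ``I can't be captured by a cult''
% b       : the known overconfidence contributed by bias
% p'      : the credence after correcting for the known bias
\[
  \hat{p} = p + b \quad \text{with } b \text{ known}
  \;\Longrightarrow\;
  p' = \hat{p} - b = p .
\]
% Any bias still present in p' is unknown by construction, and an unknown
% residual bias afflicts every belief equally, so it cannot single out this one.
```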
what could a human or evolved meme do when the subject isn’t aware the game is on?
Contrived scenarios can surprise any belief, however correct it is about expected scenarios.