Please share ideas/articles/resources for immunizing one's kids against mind viruses.
I think I was lucky myself in that I was partially indoctrinated in Communist China, then moved to the US before middle school, which made it hard for me to strongly believe any particular religion or ideology. Plus the US schools I went to didn’t seem to emphasize ideological indoctrination as much as schools currently do. Plus there was no social media pushing students to express the same beliefs as their classmates.
What can I do to help prepare my kids? (If you have specific ideas or advice, please mention what age or grade they are appropriate for.)
Do you think that having your kids consume rationalist and effective altruist content and/or doing homeschooling/unschooling are insufficient for protecting your kids against mind viruses? If so, I want to understand why you think so (maybe you’re imagining some sort of AI-powered memetic warfare?).
Eliezer has a Facebook post where he talks about how being socialized by old science fiction was helpful for him.
For myself, I think the biggest factors that helped me become/stay sane were spending a lot of time on the internet (which led to me discovering LessWrong, effective altruism, Cognito Mentoring) and not talking to other kids (I didn’t have any friends from US public school during grades 4 to 11).
Do you think that having your kids consume rationalist and effective altruist content and/or doing homeschooling/unschooling are insufficient for protecting your kids against mind viruses?
Homeschooling takes up too much of my time and I don’t think I’m very good at being a teacher (having been forced to try it during the current school closure). Unschooling seems too risky. (Maybe it would produce great results, but my wife would kill me if it didn’t. :) “Consume rationalist and effective altruist content” makes sense but some more specific advice would be helpful, like what material to introduce, when, and how to encourage their interest if they’re not immediately interested. Have any parents done this and can share their experience?
and not talking to other kids (I didn’t have any friends from US public school during grades 4 to 11)
Yeah that might have been a contributing factor for myself as well, but my kids seem a lot more social than me.
“Consume rationalist and effective altruist content” makes sense but some more specific advice would be helpful, like what material to introduce, when, and how to encourage their interest if they’re not immediately interested. Have any parents done this and can share their experience?
I don’t have kids (yet) and I’m planning to delay any potential detailed research until I do have kids, so I don’t have specific advice. You could talk to James Miller and his son. Bryan Caplan seems to also be doing well in terms of keeping his sons’ views similar to his own; he does homeschool, but maybe you could learn something from looking at what he does anyway. There are a few other rationalist parents, but I haven’t seen any detailed info on what they do in terms of introducing rationality/EA stuff. Duncan Sabien has also thought a lot about teaching children, including designing a rationality camp for kids.
I can also give my own data point: Before discovering LessWrong (age 13-15?), I consumed a bunch of traditional rationality content like Feynman, popular science, online philosophy lectures, and lower quality online discourse like the xkcd forums. I discovered LessWrong when I was 14-16 (I don’t remember the exact date) and read a bunch of posts in an unstructured way (e.g. I think I read about half of the Sequences but not in order), and concurrently read things like GEB and started learning how to write mathematical proofs. That was enough to get me to stick around, and led to me discovering EA, getting much deeper into rationality, AI safety, LessWrongian philosophy, etc. I feel like I could have started much earlier though (maybe 9-10?) and that it was only because of my bad environment (in particular, having nobody tell me that LessWrong/Overcoming Bias existed) and poor English ability (I moved to the US when I was 10 and couldn’t read/write English at the level of my peers until age 16 or so) that I had to start when I did.
If you’re looking for a datapoint, I found and read this ePub of all of Eliezer’s writing when I was around 13 or 14. Would read it late into the night every day (1am, 2am) on the tablet I had at the time, I think an iPhone.
Before that… the first book I snuck out to buy+read was Sam Harris’s “Letter to a Christian Nation” when I was 12-13, and I generally found his talks and books to be really exciting and mind-expanding.
Opening the Heart of Compassion outlines the Buddhist model of 6 deleterious configurations that people tend to fall into. On top of this, I would add that many of the negative consequences come from our tendency toward monism: finding one thing that works and then trying to build an entire worldview out of it.
Are you most concerned that:
1) they will believe false things (which is bad for its own sake)
2) they will do harm to others due to false beliefs
3) harm will come to them because of their false beliefs
4) they will become alienated from you because of your disagreements with each other
5) something else?
It seems like these different possibilities would suggest different mitigations. For example, if the threat model is that they just adopt the dominant ideology around them (which happens to be false on many points), then that results in them having false beliefs (#1), but may not cause any harm to come to them from it (#3) (and may even be to their benefit, in some ways).
Similarly, depending on whether you care more about #1 or #4, you may try harder to correct their false ideas, or to establish a norm for your relationship that it’s fine to disagree with each other. (Though I suspect that, generally speaking, efforts that tend to produce a healthy relationship will also tend to produce true beliefs, in the long run.)
I should also address this part:
For example, if the threat model is that they just adopt the dominant ideology around them (which happens to be false on many points), then that results in them having false beliefs (#1), but may not cause any harm to come to them from it (#3) (and may even be to their benefit, in some ways).
Many Communist true believers in China met terrible ends as waves of “political movements” swept through the country after the CCP takeover, and pitted one group against another, all vying to be the most “revolutionary”. (One of my great-grandparents could have escaped but stayed in China because he was friends with a number of high-level Communists and believed in their cause. He ended up committing suicide when his friends lost power to other factions and the government turned on him.)
More generally, ideology can change so quickly that it’s very difficult to follow it closely enough to stay safe, and even if you did follow the dominant ideology perfectly, you’re still vulnerable to the next “vanguard” who pushes the ideology in a new direction in order to take power. I think even if “adopt the dominant ideology” is a sensible defensive strategy for living in some society, you’d still really want to avoid getting indoctrinated into being a true believer, so you can apply rational analysis to the political struggles that will inevitably follow.
I guess I’m worried about:
They will “waste their life”, for both the real opportunity cost and the potential regret they might feel if they realize the error later in life.
My own regret in knowing that they’ve been indoctrinated into believing wrong things (or into having unreasonable certainty about potentially wrong things), when I probably could have done something to prevent that.
Their views making family life difficult. (E.g., if they were to secretly record family conversations and post them on social media as examples of wrongthink, like some kids have done.)
Can’t really think of any mitigations for these aside from trying not to let them get indoctrinated in the first place...
I don’t have children, and my upbringing wasn’t especially good or bad on learning rationality.
Still, what I’m noticing in your post and the comments so far is the idea that rationality is something to put into your children.
I believe that rationality mostly needs to be modeled. Take your mind and your children’s connection to the universe seriously. Show them that thinking and arguing are both fun and useful.
Do you mean how to teach them critical thinking skills? Or how to get them to prize the truth over fitting in?
I’m going to assume you’re not a radical leftist. What if your 16 year old kid started sharing every leftist meme because they’ve really thought about it and think it’s true? What if they said “it doesn’t matter if there’s pressure to hold these political opinions; they’re as true as gravity!”
Would you count that as a success, since they’re bold enough to stand up to an authority figure (you) to honestly express their deeply-considered views? Or a failure? If the latter, why?
I’m going to assume you’re not a radical leftist. What if your 16 year old kid started sharing every leftist meme because they’ve really thought about it and think it’s true?
I don’t think that most people who really think issues through agree with every leftist meme and think the meme is true. Part of modern leftish ideology is that you should say certain things even when they are not true, because you want to show solidarity. There’s also a belief that certain values shouldn’t be “thought through”. They are sacred and not supposed to be questioned.
It sounds like you’re setting the bar for epistemic hygiene (i.e. not being infected by a mind virus) at being able to justify your worldview from the ground up. Is that an isolated demand for rigor, or would you view anyone unable to do that as an unreasonable conformist?
I think you’re ignoring that plenty of people believe in epistemics that value engaging in critical analysis only in the sense of critical theory, not in the sense of critical thinking.
In leftish activism, people are expected to approve at the same time of the meme “homophobia should always be challenged” and the meme “Islam shouldn’t be challenged”. Explicit discussions about how those values should be traded off against each other are shunned because they violate the underlying sacredness.
Frequently, there’s an idea that beliefs should be based on experience, or on trusting people with experience, and not on thinking things through. Valuing thinking things through is not universal.
I’m just not convinced that the radical left has epistemic norms or value priorities that are unusually bad. Imagine you were about to introduce me to five of your friends to talk politics. One identifies as a radical leftist, one a progressive moderate, another a libertarian, the fourth a conservative, and the fifth apolitical. All five of them share a lot of memes on Facebook. They also each have a blog where they write about their political opinions.
I would not be particularly surprised if I had a thoughtful, stimulating conversation with any of them.
My prior is that intellectual profiling based on ideology isn’t a good way to predict how thoughtful somebody is.
So for me, if Wei Dai Jr. turned out to be a 16 year old radical leftist, I wouldn’t think he’s any more conformist than if he’d turned out to be a progressive, libertarian, conservative, or apolitical.
That might just be a crux of disagreement for us based on differing experiences in interacting with each of these groups.
A 16yo going into the modern school system and turning into a radical leftist is, more often than not, a failure state rather than a success state.
Young leftist conformists outnumber the thought-out and well-reasoned young leftists by at least 10 to 1, so that’s where our prior should be. Hypothetical Wei then has a few conversations with his hypothetical radical leftist kid, and the kid reasons well for a 16yo. We would expect a well-reasoned leftist to reason well more often than a conformist leftist, so that updates our prior, but I don’t think it goes far enough to overcome the original 10 to 1. Well-reasoned people only make their arguments sound well-reasoned to others maybe 90% of the time at most, and even conformists can make nice-sounding arguments (for a 16yo) fairly often.
Even after the conversations, it’s still more likely that the hypothetical radical leftist kid is a conformist rather than well-reasoned. If hypothetical Wei had some ability to determine to a high degree of certainty whether his kid was a conformist or well-reasoned then that would be a very different case and he likely wouldn’t have the concerns that his children will be indoctrinated that he expressed in the original post.
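The update argued above can be sketched in odds form. The 10:1 prior and the "90% max" figure come from the comment; the 0.5 chance that a conformist's argument nonetheless sounds well-reasoned is an illustrative assumption standing in for "fairly often":

```python
# Odds-form Bayes: posterior odds = prior odds * likelihood ratio.
# Assumed numbers:
#   prior odds (conformist : well-reasoned) = 10:1  (from the comment)
#   P(sounds well-reasoned | well-reasoned) = 0.9   (the "90% max" figure)
#   P(sounds well-reasoned | conformist)    = 0.5   (illustrative: "fairly often")

def posterior_odds(prior_odds: float,
                   p_evidence_given_h1: float,
                   p_evidence_given_h2: float) -> float:
    """Update prior odds for H1 over H2 on evidence observed once."""
    return prior_odds * (p_evidence_given_h1 / p_evidence_given_h2)

# Evidence: the kid's arguments sound well-reasoned.
odds = posterior_odds(10.0, 0.5, 0.9)  # conformist : well-reasoned
print(round(odds, 2))  # 5.56
```

Under these assumptions the posterior odds are about 5.6:1 — the evidence shifts things toward "well-reasoned," but conformist remains the more likely hypothesis, which is the comment's point.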
You’re neglecting the base rate of 16 year old conformity. I think this is some pretty silly speculation, but let’s run with it. Isn’t the base rate for 16 year old conformity at least 10 to 1? If so, a 16 year old who’s a leftist is no more likely to be a conformist than any other.
In the end, what we’re looking for is a reliable signal that, whatever the 16 year old thinks, it’s due to their independent reasoning.
Widely shared reasonable beliefs won’t cut it, because they wouldn’t have to think it out for themselves. Outrageous contrarian views won’t work, because that’s not reasonable.
You’d have to look for them to hold views that are both reasonable and contrarian. So, a genius. Is that a realistic bar to diagnose your kid as uninfected by mind viruses?
Ideological conformity in the school system is not uniform. A person turning left when everybody else is turning right is much less likely to be a conformist than someone else turning right.
ETA: Without metaphor, our priors for conformist vs. well-reasoned are different for young rightists or non-leftists in the school system.
My daughter is 2. Everything we do with her is either indoctrination or play; she doesn’t have enough language yet for the learning-begets-learning we naturally assume with older kids and adults.
I was in the military, which is probably the most successful employer of indoctrination in the US. I believe the key to this success rests with the clarity of the indoctrination’s purpose and effectiveness: the purpose is to keep everyone on the same page, because if we aren’t our people will die (where our people means the unit). Indoctrination is the only tool available for this because there isn’t time for sharing all the relevant information or doing analysis.
I plan to capture these benefits for my daughter by being explicit, when she inevitably has questions, that I’m using indoctrination and why indoctrination is a good tool for the situation, rather than framing it as how we think or feel about things.
The bearing I think this has on the question of mind viruses is that she will know what indoctrination looks like when she sees it. Further, she will have expectations of purpose and impact; political indoctrination fails these tests, which I hope will trigger rejection (or at least forestall overcommitment).
How are you handling the problem that rationality often pays off negatively unless it’s adopted by a critical mass (e.g., it often leads to poor signaling, or anti-signaling if one is lucky)?