So I was brainstorming recently with a friend about this very topic: how to convince someone who doesn't enjoy engaging in rational reasoning to support a goal of rationality (existential risk reduction, AI study). For instance, I'd love to be able to persuade a random general-population friend to go vegetarian, or to think about the real disparities in the world, or about the implications of our actions if they were scaled up.
Two possible strategies I came up with, both of which border on rhetoric, persuasion, and I guess the Dark Arts in general, are:
Inspiration: "Inspire" them to value the rational process, i.e., philosophical reasoning and evidence-based reasoning. Inspiration here means coming to see that something is socially virtuous, aesthetically pleasing, or has good instrumental results.
I think immersing them in a rational community and leveraging their social values would work for many people, who would then have to use the tools of rational argument to gain status or to see themselves as socially virtuous (think LessWrong meetups). At some point they might realize how powerful the tools are for preventing deception or cheating, and come to value the tools of rationality apart from their use in status games.
Art, in the form of books or movies whose heroes are rewarded for deep deductive reasoning from evidence, might be even better for this, seeing as HPMOR is regarded as a big draw for the general population to LessWrong. Examples: HPMOR, Bill Nye the Science Guy, Cosmos, The Skeptics' Guide to the Universe. Anything that inspires an aesthetic love of science, reasoning, or philosophy. I've yet to see The Scout Mindset inspire a wave of converts (although I'd be curious to analyze LessWrong / EA Forum usage after the book was published), and my friend didn't "convert" after listening to it. It does have useful tools for the already-converted, though.
Finally: associate success at status, money, or relationships with rational argumentation and sound epistemological values (coherence and correspondence). This would plant a seed of curiosity about why these tools are so effective; people will eventually generalize the lessons, and then, boom: they're unknowingly rationalists, and may even join the community itself once they find it.
Approach from the Side: Identities are powerful. You can emphasize that the virtuousness of one part of a person's identity is consistent with the virtuousness of another part of their identity. Eventually you can weave these syllogisms together to support an "actually good value." For example, I find that every liberal and conservative pays lip service to worries about a 'post-truth world' or 'fake news,' and this is great, because you can use this piece of their identity to make them more curious about what is true. When I tried this with my parents, it went generally something like this:
I like A (hating on people who tell lies)
A is similar to B (liking people who tell the truth)
B is similar to C (liking scientists who tell more truth, better than people who tell less truth)
C is similar to D (I like more truth, better than less truth)
D is similar to E (I like the process that creates more truth, better than the process that creates less truth)
E is similar to … Z (I like consistent principles that build up an accurate predictive model of the world within my mind and are very generally useful)
My approach-from-the-side technique worked briefly with my parents: for maybe two to four hours they were curious about the evidence for their beliefs, and even about some epistemology. After that, though, their belief statements went back to equilibrium (total confidence, signaling, etc.), although maybe they have shifted slightly in some indiscernible way. Anyway, the brief flicker of light I saw in them still feels worth it to me, though it might not to you.
Both of these strategies feel like dirty and manipulative uses of the Dark Arts, but the people I talked to are already immersed in a whirlpool of such tactics without even noticing it. And the end goal of bringing my friends and family at all closer to philosophy or rationality is of major utility to me and my posterity. If splashing a drowning man might save him, I would do it.