I think there is a general consensus that we would probably live in a better world if we raised the sanity waterline. However, most people interested in rationality, biases, Bayesian theory, decision theory, etc. are people who already have a natural inclination towards these topics. I feel I have benefited enormously from reading LW, the Sequences, listening to Julia Galef’s podcast, etc., but I was already interested in rationality before knowing about them. What initiatives have more potential to make rationality more popular among people without this natural inclination?
Julia Galef’s work is about getting more people interested in rationality. The Scout Mindset offers an aesthetic that can draw more people in. It can also be applied by people who don’t have the IQ to follow extremely complicated arguments.
However, the idea that it’s necessary to make more people interested in rationality is flawed. If we did a better job with the people in our community who are actually interested in rationality, our community would become a resource for cognitive elites who want to find important knowledge. That would have a massive impact without our needing to convince everyone in the world to be interested in rationality.
It’s better for the rationality community to focus on its internals than on evangelizing. Getting our own house in order is more important than spreading the word.
I agree with the general idea that actually being awesome makes evangelizing easier, and not being awesome makes the whole awesomeness-promising project suspicious.
But there are also benefits from cooperation, or from social pressure aligned with your goals. To get these benefits, you need to have similarly-minded people around you. There are places with enough rationalists to start awesome group projects; but there are also places without them. Groups can be more productive than individuals, because of division of labor, getting sidekicks, etc.
I don’t think the main problem with starting new groups is evangelizing. Making it easier to run a high-quality meetup seems to me more important than evangelizing with the idea of turning people into rationalists.
If more high-quality articles get written on LessWrong, those articles spread on their own. Finding them does require some openness to new information and to new ways of informing yourself about the world. The kind of people who initially came to LessWrong had that kind of openness, which is why they could be reached that way.
If you put real effort into evangelizing you get Eternal September problems.
So I was brainstorming recently with a friend about this very topic: how to convince someone to support a goal of rationality (existential risk reduction, AI study) who doesn’t enjoy engaging in rational reasoning. Like, I’d love to be able to persuade my random gen pop friend to be vegetarian, or to think about the real disparities in the world, or about the implications of our actions if they were scaled up.
Two possible strategies I came up with, both bordering on Rhetoric, Persuasion, and I guess the Dark Arts in general, are:
Inspiration: “Inspire” them to value the rational process, i.e. philosophical reasoning and evidence-based reasoning. Inspiration involves realizing that something is socially virtuous, aesthetically pleasing, or has good instrumental results.
I think immersing them in a rational community and using their social reasoning values would work for many people who then have to use the tools of rational argument to get status or to see themselves as socially virtuous (like LessWrong Meet-Ups). Maybe they would realize how powerful the tools are for preventing deception or cheating at some point, and come to value the tools of rationality apart from their use in status games.
Art in the form of books or movies that encourages deep deductive reasoning from evidence, and that rewards the heroes for it, might be even better for this, seeing as HPMOR is seen as a big draw for the gen pop to LessWrong. Examples are HPMOR, Bill Nye the Science Guy, Cosmos, and The Skeptics’ Guide to the Galaxy. Anything that inspires an aesthetic love of science, reasoning, or philosophy. I’ve yet to see The Scout Mindset inspire a wave of converts (although I’d be curious to analyze LessWrong / EA Forum usage after the book was published), and my friend didn’t “convert” after listening to it. It does have useful tools for the converted, though.
Finally, conflating success at status, money, or relationships with rational argumentation and with coherence- and correspondence-based epistemological values. This would plant the seed of curiosity about how these tools are so effective; then people will eventually generalize these lessons, and boom. They’re unknowingly rationalists, and will possibly join the community itself once they find it.
Approach from the Side: Identities are powerful. You can emphasize the virtuousness of one part of people’s identities as being consistent with the virtuousness of another part of their identity. Eventually you can weave these syllogisms together to support an “actually good value”. For example, I find that every liberal and conservative pays lip service to decrying a “post-truth world” or “fake news”, and this is great because one can use this piece of their identity to make them more curious about what is true. When I did this with my parents, it went generally something like this:
I like A (hating on people who tell lies)
A is similar to B (liking people who tell the truth)
B is similar to C (liking scientists who tell more truth, better than people who tell less truth)
C is similar to D (I like more truth, better than less truth)
D is similar to E (I like the process that creates more truth, better than the process that creates less truth)
E is similar to …. Z (I like consistent principles that build up an accurate predictive model of the world within my mind and are very generally useful)
My approach-from-the-side technique worked briefly with my parents; for maybe two to four hours they were curious about evidence for their beliefs and even some epistemology. Their belief statements went back to equilibrium (total confidence, signaling, etc.) after that, though perhaps they shifted slightly in some indiscernible way. Anyway, the brief flicker of light I saw in them still feels worth it to me, but it might not to you.
Both these strategies feel like dirty and manipulative uses of the Dark Arts, but the people I talked to are immersed in a whirlpool of such tactics without even noticing it. Also, the end goal of bringing my friends and family closer to philosophy or rationality at all is of major utility to me and my posterity. If splashing a drowning man might save him, I would do it.
What do you mean by “reach out to people”? Usually that just means contact them. But here you seem to mean something different.
You are right, I will clarify the question. Thank you!