“How do you interest people in rationality?” is a question I have been thinking about for a very long time. The most important insights I have into this are below.
How I crossed the first divide:
As a teen, I felt expected by my peers to think for myself: the “think for yourself” mantra was a core part of our culture. This seems especially relevant because peer pressure gets through even to people who aren’t rational.
Influenced by that mantra, repeated constantly by the other teens, I was motivated to start noticing flaws in the ideas being presented to me.
Realizing that there were flaws everywhere was key. I had to see that even adults could be wrong, teachers could be wrong, books could be wrong, authorities could be wrong; all of it could be wrong. I needed to see examples of incorrect information in each category before I woke up to the fact that every possible source of information could be flawed. I would not have believed there was a problem if I hadn’t seen it myself, and without that, I would not have been interested in the solution.
A critical aspect of this was realizing that really important information could be wrong. I also needed to understand how wrong information affected me. Not everyone is going to notice so many flaws on their own and realize the implications, especially if they haven’t developed their thinking skills very far. I was lucky to be able to do this for myself. I think a lot of people would benefit if you connected the dots for them, as they may not have been taught the skills to do it themselves.
I have observed that if you overwhelm a person with too many shocking problems at once, it’s too much for them: they go into denial and reject you entirely. If I wanted to wake a person up to the fact that the world is full of incorrect information, that it can be found even in important places, and that they have likely learned a lot of incorrect information, I would use baby steps.
Once I knew there was a problem, it had to occur to me that the solution was to avoid accepting new incorrect information and to go through and correct all my existing information. For me, the idea of correcting the information was obvious. For some, it may not be (they may choose drugs, or some other escape), so if I were to present people with the problem, I would also describe the solution well enough that they felt there was an option likely to work for them.
Then, there was a sense of trepidation. You don’t sound like you had this experience, but consider this: most people grow up in a culture full of irrationality, with no knowledge of logical fallacies and an underdeveloped ability to think critically, let alone any idea what Bayes’s Theorem is. They have too few defenses against irrationality, so they end up building their entire lives on a mixture of irrational beliefs and whatever facts manage to make it through. There is (for lack of a known term for this) an “information debt,” very much like software debt. For anyone who doesn’t know, software debt builds up when you write your program in such a disorganized way that it has to be restructured before you can build anything more on it; that restructuring is called refactoring, and it can be very time-consuming. You don’t have to be a coding genius to immediately sense that taking apart your beliefs and refactoring them is going to be a gigantic, complicated job. Becoming a rationalist is a huge investment for anyone carrying an information debt of any size, and most people weren’t lucky enough to develop thinking skills as children, so a lot of them have a huge debt of wrong information to correct.
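To make the software-debt analogy concrete, here is a minimal sketch in Python (the function names and the toy “name,age” record format are hypothetical, chosen only for illustration) of what the debt looks like and what paying it down does:

```python
# Hypothetical illustration of software debt: one tangled function that
# mixes parsing, validation, and output, so nothing new can be added
# without first untangling it.

def process(raw):
    parts = raw.split(",")
    if len(parts) != 2 or not parts[1].strip().isdigit():
        print("bad record:", raw)
        return None
    name, age = parts[0].strip(), int(parts[1])
    print(f"{name} is {age}")
    return (name, age)

# Refactored: each concern stands alone, so new behavior (new formats,
# new outputs) can be layered on without reworking everything first.
# Note that refactoring changed no behavior; it only restructured the code.

def validate(raw):
    parts = raw.split(",")
    return len(parts) == 2 and parts[1].strip().isdigit()

def parse(raw):
    name, age = raw.split(",")
    return name.strip(), int(age)

def report(name, age):
    print(f"{name} is {age}")

if __name__ == "__main__":
    for record in ["Ada, 36", "garbled line"]:
        if validate(record):
            report(*parse(record))
        else:
            print("bad record:", record)
```

The refactored version does exactly what the tangled one did; the payoff is that only after that cost is paid can you extend it cheaply. Beliefs built on unexamined premises carry the same kind of deferred cost.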
In addition to the sense of needing to invest a lot of time, I was afraid of what would happen if I challenged my current world view and it fell apart. What if I wasn’t able to put it all back together after I took it apart? Can that make you crazy?
To make things worse, I had never been exposed to even so much as a list of logical fallacies, so I had NO CLUE that tools existed to help you tell true information from false. I felt like I was opening Pandora’s box.
In my case, the way I overcame the trepidation was by asking myself what would happen if I left my world view the way it was. That prospect caused more trepidation than the idea of correcting it, so I chose to make the massive investment and take the risk of making a huge mess of myself.
That’s how I became a rationalist.
I think, though, that anything you can do to reduce that sense of trepidation is a necessity. For instance, letting people know that there are powerful tools for cutting through confused beliefs would empower more people to choose to become rationalists.
Once I discovered logical fallacies, I found myself referring back to them after I got into an argument with someone. They make excellent self-defense weapons. I think they might become popular and serve as a positive introduction to rationality if they were presented as a solution to the problem of losing arguments. After all, it does feel pretty cool to be a logical ninja—able to win the majority of my arguments. (:
The above was my dissection of how I became a rationalist. The story version has been written up and saved for later. I’d have added it here, but I didn’t want to make my comment a billion pages long.