I think you might be pattern matching to straw-Vulcan rationality, which is distinct from what CFAR wants to teach.
I don't think that's true. In my experience spending time with rationalists and studying aspects of rationality myself, I have found that rationalists separate themselves from the general population in many ways that make it hard for them to convince non-rationalists. These are traits that rationalists cultivate partly in an effort to improve their thinking, but also to signal membership in the rationalist tribe (rationalists are human, after all), and they are not things that rationalists can easily turn on and off. I can identify three general groups of traits that many rationalists seem to share:
1) The use of esoteric language. Rationalists tend to use a lot of language that is unfamiliar to others. Rationalists “update” their beliefs. They fight “akrasia”. They “install” new habits. If you spend any time in rationalist circles, you will have heard those terms used in those ways very frequently. This is of course not bad in and of itself. But it marks one as a member of the rationalist tribe, and even someone who does not know about rationalists will be able to identify a speaker who uses this terminology as alien and “weird”. My first encounter with rationalists was indeed of this type. All I knew was that they seemed to speak in a very strange manner.
2) Rationalists, at least the ones in this community, hold a variety of unusual beliefs. I actually find it hard to identify those beliefs because I hold many of them. Nonetheless, a chat with many other human beings regarding the theory of mind, metaphysics, morality, etc. will reveal gaps the size of the Grand Canyon between the average rationalist and the average person. Maybe at some level, there is agreement, but when it comes to object-level issues, the disagreement is immense.
3) Rationalists think very differently from the way most other people think. That is, after all, the point. However, it means that arguments that convince rationalists will frequently fail to convince an average person. For instance, arguing that the effects of brain damage show that there is no soul in the conventional sense will get you nowhere with an average person, while many rationalists see this as a very persuasive, if not conclusive, argument.
I claim that to convince another human being, you must be able to model their cognitive processes. As many rationalists realize, humans have a tendency to model other humans as similar to themselves. Doing otherwise is incredibly difficult, and the difficulty increases exponentially with how different the other human is from you. This is, after all, unsurprising. When modeling an identical copy of yourself, you need only feed in fake sensory inputs and see what the output would be. When modeling someone different from yourself, you basically need to replicate their brain within your own. This is obviously very effortful and error-prone. It is actually hard enough that it is difficult for you to replicate the processes that led you to believe something you no longer believe. And you had access to the brain that held those now-discarded beliefs!
I do not claim it is an impossible task. But I do claim that the better you are at rationality, the worse you will be at understanding non-rationalists and how to convince them of anything. If anything, as a good rationalist, you will have learned to flinch away from lines of reasoning that are the result of common cognitive errors. But of course, cognitive errors are an integral part of the way most people live their lives. So if you flinch away from such things, you will miss lines of reasoning that would be very fruitful for convincing others of the correctness of your beliefs.
Let me provide an example. I recently discussed abortion with a non-rationalist but very intelligent friend. I pointed out that within the context of fetuses being humans deserving of rights, abortion is obviously murder and that he was missing the point of his opponents. The responses I got were riddled with fallacies. The most interesting was the idea that science has determined that fetuses are not humans. I tried to explain that science can certainly tell us what is going on at various stages of development, but that it cannot tell us what is a “human deserving of rights”, as that is a purely moral category. This was to no avail. People (even very intelligent people) hang their beliefs and actions on such fallacy-riddled lines of reasoning all the time. If you train yourself to avoid such lines of reasoning, you will have great difficulty in convincing others without first turning them into yourself.
My first encounter with rationalists was indeed of this type.
If I’m chatting with other rationalists I will use a term like “akrasia”, but in other contexts I will say “procrastination”. I’m perfectly able to use different words in different social contexts.
In my experience spending time with rationalists and studying aspects of rationality myself
There are ways of studying rationality that do have those effects. I don’t think going to a CFAR workshop is going to make a person less able to convince the average person.
I tried to explain that science can certainly tell us what is going on at various stages of development, but that it cannot tell us what is a “human deserving of rights”, as that is a purely moral category. This was to no avail.
Convincing a person who believes in scientism that science doesn’t work that way is similar to trying to convince a theist that there’s no god. Both are hard problems that you can’t easily solve by making emotional appeals, even if you are good at making emotional appeals.
I claim that to convince another human being, you must be able to model their cognitive processes.
I don’t believe that to be true. In many cases it’s possible to convince other people by making generalized statements that different human beings will interpret differently and where it’s not important that you know which interpretation the other person chooses.
In NLP that principle is called the Milton model.
As many rationalists realize, humans have a tendency to model other humans as similar to themselves.
I think it would be more accurate to say humans have a tendency to model other humans as they believe themselves to be.
I pointed out that within the context of fetuses being humans deserving of rights, abortion is obviously murder and that he was missing the point of his opponents.
From an LW perspective, I think “abortion is obviously murder” is an argument with little substance because it’s about the definitions of words. My reflex from rationality training would be to taboo “murder”.
I actually find it hard to identify those beliefs because I hold many of them. Nonetheless, a chat with many other human beings regarding the theory of mind, metaphysics, morality, etc. will reveal gaps the size of the Grand Canyon between the average rationalist and the average person. Maybe at some level, there is agreement, but when it comes to object-level issues, the disagreement is immense.
I don’t think the average rationalist has the same opinion on any of those subjects. There’s a sizeable portion of EA people in this community, but not everybody agrees with the EA frame.
I had a Hamming circle at our local LW meetup where pride was very important to the circled person. He chose actions because he wanted to achieve results that make him feel pride. For myself, pride is not an important concept or emotion. It’s not an emotion that I seek.
The Hamming circle allowed me a perspective into the workings of a mind that is, in that regard, significantly different from my own. Hamming circles are a good way to learn to model people who are different from yourself.
There are people in this community who focus on analytical reasoning and as a result are bad at modeling normal people. I think those people would become both more rational and better at modeling normal people if they frequently engaged in Hamming circles.
I think the same is true for practicing techniques like goal factoring and urge propagation.
If you train Focusing, you can speak from that place to make stronger emotional appeals than you could otherwise.