How do you introduce your friends to LessWrong?
Sometimes I’ll start a new relationship or friendship, and as this person becomes close to me I’ll want to talk about things like rationality, transhumanism, and the Singularity. This has never gone badly, as these subjects are interesting to smart people. But I think I could introduce these ideas more effectively, with a better structure, to maximize the chance that those close to me become as interested in these topics as I am (e.g. to the point of reading or participating in OB/LW, donating to SIAI, or attending or founding rationalist groups). It might help to present the futurist ideas in increasing order of outrageousness, as described in Yudkowsky’s (1999) future shock levels. Has anyone else had experience introducing new people to these strange ideas? Any thoughts or tips?
Edit: for futurist topics, I’ve sometimes begun (in new relationships) by reading and discussing science fiction short stories, particularly those relating to alien minds or the Singularity.
For rationalist topics, I have no real plan. One girl really appreciated a discussion of the effect of social status on the persuasiveness of arguments; she later mentioned that she’d even told her mother about it. She also appreciated the concept of confirmation bias. She’s started reading LessWrong, but she’s not a native English speaker so it’s going to be even more difficult than LessWrong already is.
I think of LessWrong from a really, really pragmatic viewpoint: it’s like software patches for your brain to eliminate costly bugs. There was a really good illustration in the Allais mini-sequence—that is a literal example of people throwing away their money because they refused to consider how their brain might let them down.
Edit: Related to The Lens That Sees Its Flaws.
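In case it helps anyone else making that pitch, here is a minimal sketch of the Allais-style choice the mini-sequence is built around; the dollar amounts and probabilities below are assumptions chosen for illustration, not figures quoted from the posts.

```python
# Rough sketch of the Allais-style choice pattern; payoffs are illustrative.

def expected_value(gamble):
    """Expected dollar value of a gamble given as (probability, payoff) pairs."""
    return sum(p * payoff for p, payoff in gamble)

# Experiment 1: most people take the sure thing, 1A.
gamble_1a = [(1.00, 24_000)]
gamble_1b = [(33 / 34, 27_000), (1 / 34, 0)]

# Experiment 2: the same people usually take 2B, even though 2A and 2B are
# just 1A and 1B with every winning probability scaled down by the same factor.
gamble_2a = [(0.34, 24_000), (0.66, 0)]
gamble_2b = [(0.33, 27_000), (0.67, 0)]

for name, gamble in [("1A", gamble_1a), ("1B", gamble_1b),
                     ("2A", gamble_2a), ("2B", gamble_2b)]:
    print(f"{name}: expected value = ${expected_value(gamble):,.2f}")

# Preferring 1A in the first pair but 2B in the second is inconsistent:
# no single assignment of utilities to the outcomes supports both choices,
# which is exactly the kind of preference pattern that can be exploited
# for money -- the "costly bug" the comment above is pointing at.
```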
It shows you that there is really more to most things than meets the eye, but more often than not much less than you think. It shows you that even smart people can be completely wrong, but that most people are not even wrong. It tells you to be careful in what you emit and skeptical of what you receive. It doesn’t tell you what is right; it teaches you how to think and how to become less wrong. And doing so is in your own self-interest, because it helps you attain your goals and achieve what you want. Thus what you want is to read and participate on LessWrong.
I am probably a miserable talker: usually, after I introduce rationality- or singularity-related topics, people tend to strengthen their former opinions even further. I could well use a “good argumentation for rationality dummies” article. No, reading through all the sequences does not help. (Would understanding them?)
Often enough it seems that I achieve better results by not touching any “religious” topic too early; “religious” meaning that the argument against holding that opinion requires an understanding of reductionism and epistemology worthy of a third-year philosophy student (and, by the way, acceptance of them too).
This may seem to take enormous amounts of time to get people onto this train, but the average IQ is 100, and rationality seems to be even less widespread than intelligence, so it may actually be more useful to hint in the right direction on specific topics than to try to convey it all.
And how does this actually help your own intentions? It seems non-trivial to me to find a utility function under which taking the time to improve the rationality quotient of a few philosophy or arts students, or electricians, or whoever, is actually a net win. Or is everybody here just hanging out with (gonna-be) scientists?
I am probably a miserable talker: usually, after I introduce rationality- or singularity-related topics, people tend to strengthen their former opinions even further.

I’m not sure this is what you’re doing, but I’m careful not to bring up LessWrong in an actual argument. I don’t want arguments for rationality to be enemy soldiers.
Instead, I bring rationalist topics up as an interesting thing I read recently, or as an influence on why I did a certain thing a certain way, or hold a particular view (in a non-argument context). That can lead to a full-fledged pitch for LessWrong, and it’s there that I falter; I’m not sure I’m pitching with optimal effectiveness. I don’t have a good grasp on what topics are most interesting/accessible to normal (albeit smart) people.
And how does this actually help your own intentions?

If rationalists were so common that I could just filter people I get close to by whether they’re rationalists, I probably would. But I live in Taiwan, and I’m probably the only LessWrong reader in the country. If I want to talk to someone in person about rationality, I have to convert someone first. I like to talk about these topics, since they’re frequently on my mind, and because certain conclusions and approaches are huge wins (especially cryonics and reductionism).
The main hurdle, in my experience, is getting people over the biases that cause them to think that the future is going to look mostly like the present. If you can get people over this, then they do a lot of the remaining work for you.