How do you upgrade people into rationalists? In particular, I want to upgrade some younger math-inclined people into rationalists (peers at university). My current strategy is:
incidentally name-drop my local rationalist meetup group (e.g. “I am going to a rationalist meetup on Sunday”)
link to LessWrong articles whenever relevant (rarely)
be awesome and claim that I am awesome because I am a rationalist (which neglects a bunch of other factors for why I am so awesome)
when asked, motivate rationality by pointing to a whole bunch of cognitive biases and to the fact that we don’t naturally have principles of correct reasoning; we just do what intuitively seems right
This is quite passive (other than the name-dropping and article linking) and mostly requires them to ask me about it first. I want something more proactive that is not straight-up linking to LessWrong, because the first thing they go to is The Simple Truth, and they immediately get turned off by it (The Simple Truth shouldn’t be the first post in the first sequence that you are recommended to read on LessWrong). This has happened a number of times.
This sounds like you think of them as mooks you want to show the light of enlightenment to. The sort of clever, mathy people you want probably don’t like to think of themselves as mooks who need to be shown the light of enlightenment. (This also might be sort of how I feel about the whole “rationalism as a thing” thing that’s going on around here.)
That said, actually being awesome by your target audience’s standards of awesome is always a good way to make them more receptive to looking into whatever you are doing. If you can use your rationalism powers to achieve things mathy university people appreciate, like top test scores or academic publications while you’re still an undergraduate, your soapbox might be a lot bigger all of a sudden.
Then again, it might be that rationalism powers don’t actually help enough in achieving this, and you’ll just give yourself a mental breakdown going for them. The math-inclined folk, who would like publication-writing superpowers, probably also see this as the expected result, so why should they buy into rationality without some evidence that it seems to be making people win more?
To be honest, unless they have exceptional mathematical ability or are already rationalists, I will consider them to be mooks. Of course, I won’t make that apparent; it is rather hard to make friends that way. Acknowledging that you are smart is a very negative signal, so I try to be humble, which can be awkward in situations like when only two out of 13 people pass a math course you are in, and you got an A- and the other guy got a C-.
And by the way, rationality, not rationalism.
Incidentally, what exactly makes a person already be a rationalist in this case?
Pretty much someone who has read the LessWrong Sequences. Otherwise, someone who is unusually well read in the right places (cognitive science, especially biases; books like Good and Real and Causality) and demonstrates that they have actually internalized those ideas and their implications.
Related question: how can I upgrade myself from someone who trolls robo-“rationalists” who think acquaintance with a particular handful of concepts, buzzwords, and habits of thought is a mark of superiority rather than just a mark of difference, into a superbeing faster than a speeding singularity who can separate P from NP in a single bound?
Rationality is about how you think, not how you got there. There have been many rational people throughout history who had read approximately none of that.
I am mostly talking about epistemic rationality, not instrumental rationality. With that in mind, I wouldn’t consider anyone from a hundred years ago or earlier to be up to my epistemic standards, because they simply did not have access to the requisite information, i.e. cognitive science and Bayesian epistemology. There are people who figured it out in certain domains (like realizing that the labels in your mind are not the actual things they represent), but those people are very exceptional, and I doubt that I will meet anyone capable of the pioneering, original work that they did.
What I want are people who know about cognitive biases, understand why they are very important, and have actively tried to reduce the effects of those biases on themselves. I want people who explicitly understand the map and territory distinction. I want people who are aware of truth-seeking versus status arguments. I want people who don’t step on philosophical landmines and don’t get mindkilled. I would not expect someone to have all of these without having at least read some of LessWrong or the material above. They might have collected some of these beliefs and mental algorithms on their own, but it is highly unlikely that they came across all of them.
Is that too much to ask? Are my standards too high? I hope not.
Eh, without adopting particularly unconventional (for this site) standards, you could reasonably say that there have been very few rational people throughout history (or none).
There’s a reason people on this site use the phrase “I’m an aspiring rationalist.”
Taboo “rationalist”. That is, don’t make it sound like this is a group or ideology anyone is joining (because, done right, it isn’t.)
Discuss, as appropriate, cognitive biases and specific techniques: e.g. the planning fallacy, “I notice I am confused”, “what do you think you know, and why do you think you know it?”, confirmation bias, etc.
Tell friends about cool books you’ve read, like HPMoR, Thinking, Fast and Slow, Predictably Irrational, Getting Things Done, and so forth. If possible, read these books on paper (not as ebooks), where your friends can see what you’re reading and ask you about them.
The problem with rationality is that unless you are already at some level, you don’t feel like you need to become more rational. And I think most people are not there, even the smart ones. It seems to me that smart people often realize they lack some specific knowledge, but they don’t go meta and realize that they lack knowledge-gathering and knowledge-filtering skills. (And that’s the smart people. The stupid ones only realize they lack money or food or something.) How do you sell something to a person who is not interested in buying?
Perhaps we could make a selection of LW articles that can be interesting even to people not interested in rationality. Less meta, less math. The ones that feel like “this website could help me make more money and become more popular”. Then people become interested, and perhaps later they become interested in something more meta: the culture that creates this kind of article.
(I guess that even for math-inclined people, the less mathy articles would be better. They can find math in a thousand different places; why should they care specifically about LW?)
As a first approximation: The Science of Winning at Life and Living Luminously.
How about bringing up specific bits of rationality when you talk with them? If they talk about plans, ask them how much they know about how long that sort of project usually takes. If they seem to be floundering at keeping track of what they’re thinking, encourage them to write the bits and pieces down.
If any of this sort of thing seems to register, start talking about biases and/or further sources of information.
This is a hypothetical procedure—thanks for mentioning that The Simple Truth isn’t working well as an introduction.