Plant Seeds of Rationality
After his wife died, Elzéard Bouffier decided to cultivate a forest in a desolate, treeless valley. He built small dams along the side of the nearby mountain, thus creating new streams that ran down into the valley. Then, he planted one seed at a time.
After four decades of steady work, the valley throbbed with life. You could hear the buzzing of bees and the tweeting of birds. Thousands of people moved to the valley to enjoy nature at its finest. The government assumed the regrowth was a strange natural phenomenon, and the valley’s inhabitants were unaware that their happiness was due to the selfless deeds of one man.
This is The Man Who Planted Trees, a popular inspirational tale.
But it’s not just a tale. Abdul Kareem cultivated a forest on a once-desolate stretch of 32 acres along India’s West Coast, planting one seed at a time. It took him only twenty years.
Like trees in the ground, rationality does not grow in the mind overnight. Cultivating rationality requires care and persistence, and there are many obstacles. You probably won’t bring someone from average (ir)rationality to technical rationality in a fortnight. But you can plant seeds.
You can politely ask rationalist questions when someone says something irrational. Don’t forget to smile!
You can write letters to the editor of your local newspaper to correct faulty reasoning.
You can visit random blogs, find an error in reasoning, offer a polite correction, and link back to a few relevant Less Wrong posts.
One person planting seeds of rationality can make a difference, and we can do even better if we organize. An organization called Trees for the Future has helped thousands of families in thousands of villages to plant more than 50 million trees around the world. And when it comes to rationality, we can plant more seeds if we, for example, support the spread of critical thinking classes in schools.
Do you want to collaborate with others to help spread rationality on a mass scale?
You don’t even need to figure out how to do it. Just contact leaders who already know what to do, and volunteer your time and energy.
Email the Foundation for Critical Thinking and say, “How can I help?” Email Louie Helm and sign up for the Singularity Institute Volunteer Network.
Change does not happen when people gather to talk about how much they suffer from akrasia. Change happens when lots of individuals organize to make change happen.
I mostly agree with the previous commenter who called this post “applause lights.” Reading it, many objections come to mind (some of which have already been voiced in other comments):
“Planting seeds” suggests starting a self-perpetuating chain reaction, but the post presents no evidence that any of the proposed methods are effective in this regard.
The post doesn’t draw on any existing knowledge in the area of influencing public opinion. For example, to me it seems pretty evident that public opinion normally flows only in one direction, namely down the social hierarchy of status, and any attempt to change it will be successful only insofar as it changes the influential higher-status levels. Otherwise, any success will be temporary and soon drowned out. Whether or not you agree with this, any practical proposal for influencing public opinion must assume some such model, and it’s crucial not to assume a wrong one.
Similarly, there is no discussion of how to avoid making a negative contribution. For people with less than spectacular communication skills, following some of the advice in the post may result in low-status behaviors that will lower people’s opinion of the cause associated with them. (I’m not saying your advice necessarily leads to this, just that it’s easy to end up applying it that way.)
What is the exact goal of the desired change? Presumably it is to lead people to form more accurate beliefs across the board. But even assuming this can be done, is it really desirable? I don’t think this question can be answered affirmatively either at the individual or at the social level. It seems evident to me that in some significant cases one is better off not adjusting one’s beliefs towards greater accuracy, and moreover, in every human society there are widespread beliefs that wouldn’t survive rational scrutiny but nevertheless play crucial signaling and coordination roles, and it’s not at all clear whether organized society is possible without them. So what are the exact goals of “spread[ing] rationality on a mass scale,” and what argument exists that the results would be positive (by whatever measure)?
Not invariably. I’ve certainly had a substantial impact on some of my professors’ views on rationality.
Indeed, but that’s exactly the point: influencing opinion higher up the status hierarchy is more difficult and more effective. If you really care about influencing opinion, and you don’t already have an influential platform, the really difficult problem is how to come up with strategies that are both feasible and effective.
Far, far harder overall. But if you have money (or some other power substitute), then high-status people are easier to buy. Not only are they already (on average) more corrupt, they are also a lot better at adjusting their beliefs according to what will benefit them. That is, after all, a big contributor to getting the high status in the first place.
Besides, that flow of opinion from high to low status apes is only the default. It’s important to be aware of the impact of status on ape behaviour if you want to change things, but mostly so that we can be aware of its effect on us and then question it.
You don’t have to influence the opinions of high-status individuals: just influence a few lower-status individuals to weigh those opinions mindfully against the facts. Those then gain status among their peers for their clever and original take on things, re-distributing a little ape-status in the direction of rationality.
I don’t believe these are seeds; such actions don’t leave a lasting impression that grows under its own power. A lot of energy can be spent in vain correcting specific errors in people who won’t take a hint. It might be much more effective to focus on educating people who can actually be expected to make rationality one of the guiding principles in their lives, learning more themselves than an occasional correction by others allows, and some of whom would spend energy propagating the meme.
A textbook on rationality, a rationality seminar, or advertising thereof would be seeds, but probably not arguing with random people who are wrong.
Most seeds planted don’t grow into mature trees. It takes more than a handful of seeds to get a good chance of a self-sustaining community of plants; it takes extensive cultivation, or a fair amount of luck.
You can drag a horse to water, but you cannot make it study the science of liquid nutrients.
Fortunately with the right level of memetic potency a seed planted in a fertile mind may yield thirty, sixty or a hundredfold!
We must, however, acknowledge that a tendency towards rationality is to a large extent genetic. He who has ears to hear (and a weak ability for compartmentalization and a dysfunctional sincere-hypocrisy system), let him hear!
EDIT: Drat, beaten to it with respect to allusion to prior usage. I’ll leave the comment here for the claim with respect to genetics.
Must we? What is the evidence?
Well technically not. It may not be an entirely free country but it is certainly a free universe. There is nothing saying we must arrive at any particular conclusion based off any given set of evidence. There isn’t even a rule saying that reaching correct conclusions will always be a competitive advantage. This is what allows certain kinds of non-rational thinking to be viable strategic options or viable ‘personalities’.
When asked that question I typically have a low expectation that evidence really has anything to do with it. I do not want to write a post on psychology and personality at this time so I will leave people to reach their own conclusion based on their own familiarity with psychology (or exposure to humans).
You doubt my good faith; I must doubt yours in interpreting the word “must” in a sense irrelevant to the context. I meant “must” in exactly the sense in which I interpreted you as meaning “must”: that is, that the evidence is so strong that it would be irrational not to be substantially moved by it.
Whether you choose to write anything further on the subject is up to you, but please revise your expectation of my good faith upwards.
A “tendency towards rationality” is not the same thing as IQ, nor does it resemble even slightly any of the “big five” personality traits, so any findings on the heritability of those characteristics would not be to the point. How does one even define or measure “a tendency towards rationality” to the standard required, and who has done so? If everyday observation suggests that rationality runs in families, that is insufficient to determine whether genes or upbringing were more important.
Seeing no particular reason to expect that “a tendency towards rationality is to a large extent genetic”, and seeing you assert it so strongly, I asked why.
Neither of us are acting in bad faith. I think I was fairly straight with “disagree with your implication except for the technical meaning which is an ironic segue”. You were fairly clear too. It is just the way people talk.
From that beginning the best outcome we could expect is to end up arguing about definitions of rationality or straight contradictions on whether the known correlations between cognitive traits are ‘rationality’ related. Why go there?
Couldn’t you just briefly explain your reasoning?
I like the way you asked a question there.
- Studies linking common traits related to interaction with authorities with respect to beliefs.
- High correlation of extraversion with conformist thinking.
- Relevance of both of the above to tendencies toward prioritising epistemic rationality.
- Game-theoretic incentives for adopting certain signalling strategies based on various social niches.
IQ: relevant.
Big Five: even more relevant. Openness to experience in particular. Extraversion is relevant via the previous mentioned conformist tendencies.
It’s about personality. Personality is overwhelmingly dominated by genetic factors.
Have you seen the children of engineers and scientists? Seriously, how is this not obvious?
Epistemic rationality is basically a mental defect. Sure, maybe not in existential terms. But certainly in the “He who dies with the most toys wins (and probably got laid more)” sense. Thinking rationally just isn’t much of a recipe for conventional success. Vulnerability to overemphasising abstract thought over primate political thought is rather closely related to tendencies towards Asperger’s. And even at the sub-diagnostic level, nerds that breed are more likely to produce offspring that are diagnosable. Somewhere along that spectrum there is a maximum likelihood of catching rationalism.
Rational thinking is nerdy. Nerdiness is heritable.
This has the makings of an excellent post.
If you could do a post on the children of engineers/scientists thing, that’d be interesting. I don’t know how useful it’d be, because I imagine it’d boil down to them being much nerdier than the children of equivalently intelligent groups, such as lawyers, Arts professors or journalists.
This would make a staggeringly excellent paper/thesis and if one were really ambitious one could also include accountants and teachers, who could be further divided by subject.
The easiest way for a current student to do this would be to try and get data on the adult children of all permanent tenured staff at their university.
I know. That’s what I was trying to avoid doing. But you could take it up. ;)
Your other ideas sound interesting too. But more in the “good subject for a PhD thesis” sense than good subject for a post. That would be a lot of work to acquire the data, analyse it suitably and write up all the various implications. Come to think of it, it would be in the top 10 broad research areas that I’d be interested in following up. (And I would perhaps even have an ethical obligation to follow that research up with a covert ‘evil mastermind’ type eugenics program.)
For my part, I am not sure how confident I could be of differences between, say, lawyers’ children and engineers’ children, after controlling for intelligence. Probably some, but it’d take a whole lot of data to find significance. Journalists? Who knows. They could even be more susceptible to rationality than engineers for all I know. Potentially contrarian truth seeking is not out of place in a journalist.
I’m more comfortable with making estimates based on the groups that imply serious differences in personality. For example, marketers, ‘Desperate Housewives’ type socialites and human resources middle managers, all after controlling for IQ. Certain kinds of biased thinking are a competitive advantage in those environments.
Thanks :)
NOW: Time to dogpile you with definitional quibbles!
Um, ok then.
I agree. Doing the kind of thing that lukeprog is talking about would be more akin to creating the environment where stem cells specialize into skin, bone or muscle cells. We would be creating an environment that rewards rationality, which would guide people into morphing into more rational people.
Is speculating on whether a metaphor is suitable an appropriate topic for Less Wrong?
It may be more of an OvercomingBias thing although it certainly crops up here from time to time too. We call it “Reference Class Tennis”. ;)
Or, more generally, to attend to the emotional context of the exchange. (E.g., with some people projecting confusion works better than projecting friendliness.)
You can tell people to read Harry Potter and the Methods of Rationality.
In agreement with Vladimir Nesov, these particular ‘seeds’, though cheap, seem to be not viable at all. I’ve never seen anyone take one corrected comment and turn it into a tree of rationality. Repeated correction has some effect, but it’s still more closely analogous to a big pile of seeds on the ground than anything that is self-perpetuating. Something like a textbook would probably do better, if you could get them to read it, but that is a fairly tough step for the expected effectiveness (at least if the textbook is geared towards how to do it and not an engineered mind virus).
I’ve had this idea rolling around in the back of my head to see what an optimized ‘rationality seed’ would look like, and to see if there are any reasonably effective conversation sized seeds. No magic yet, but here are some thoughts about what it should contain:
To get them to take interest in the idea, it needs to point out some low-hanging and easily visible fruit. This is person-dependent and non-trivial, since you basically have to bust out something important and easily explained that they haven’t already heard or thought of. Alternatively, you could point at some easily seen fruit and have them trust you that it’s fairly easy to get to. This seems necessary to make them think it’s worthwhile.
“Dark Arts” should be employed. I have no problem being motivated by emotional arguments as long as the motivations point in the same direction as my conscious thought, and have no objections to using the Dark Arts for good. I strongly suspect that the mechanisms for motivation are the same anyway; by that I mean that the information needs to go from system 1 to system 2 at some point, and it’s easier if you facilitate this rather than hoping they do it on their own.
It needs to point in the direction of the full answer. “Don’t trust anything you haven’t seen tested” might be an improvement on the margin, but it does not wrap around and self-modify. Off the top of my head, I can’t think of a good short tagline, but “before disagreeing, make sure you know why the other person thinks what they think” should get you closer: if they encounter better reasoning, they should be more likely to understand it and then accept it into their own methods of thinking.
It needs to be self reinforcing and self correcting. It seems like there’s a surprisingly high rate of ‘relapsing’ into old thinking habits. You can’t just hand someone a better method of thinking and expect it to stick if it doesn’t continue to detect and cut off beginnings of relapses before they get going.
There is an organization at my university called Replant. Every year since 1991, students have participated in a massive campaign to plant trees. Last year, 1400 students were involved.
Like your suggestion of planting the seeds of rationality, this undertaking comes with pitfalls.
I’ve heard cynical/hilarious stories of Replant groups who go to the same location, several years in a row, dig up the dead trees they planted the year before, and plant new saplings in their place. The (rationalist) lesson here is that there are places where seeds won’t grow. Effort would be better spent elsewhere.
Also, as tends to happen with many in-groups, “Replant People” have acquired a reputation for being mildly self-righteous. I can see the same thing happening with rationalists trying to spread the dogma.
So, at the risk of straining the metaphor past the breaking point, planting the seeds of rationality is a great idea as long as you’ve found a nurturing environment in which to plant them, you can invest energy in guiding their maturation, and you don’t come off too smugly.
There is distinguished precedent:
This is, of course, why trees spew out a ridiculous excess of seeds, rather than spending their metabolic energy on crafting a single, perfect one.
Those are inspiring stories, but I can’t help thinking that there are more efficient ways to turn barren land into a forest than by planting one seed at a time. Something like this:
Get or make a bunch of crumbled charcoal and manure, and till those into the desolate soil. This will provide some nutrients, reduce leaching, and make the soil more hospitable to the various microbes that are essential in the formation and maintenance of healthy soils.
Plant grasses and clover, to get the soil on its way to recovery. The grass forms extensive root systems, and the clover is a particularly hardy legume, with the ability to convert atmospheric nitrogen to ammonia.
After the soil has been rehabilitated, get a bunch of various seedlings going in a greenhouse somewhere, and plant them.
Once you get the plants well-established, they’ll take over the project, and you can relax.
I figure this will be faster, and less labor-intensive, than seed-at-a-time techniques. The various reforestation and de-desertification projects around the world seem to agree.
There’s probably a metaphor in here somewhere, but I’m pretty sure it doesn’t conflict with what you actually recommend that people do.
Your “seeds” should be memes. Trees scatter a ridiculous profligacy of seeds. Not every acorn grows up to be a mighty oak, but if the oak didn’t scatter that many it wouldn’t propagate.
You know what, after reading the comments here I was quite convinced that this post wasn’t right (what with ‘you need a lot of rationality to make a dent, a single link won’t do’ and all). Then, I looked at the welcome thread. There, you’ll see the amount of newcomers that followed a random link and ended up here, then read the sequences.
Also, non-spammy relevant links are great for LW’s SEO, bringing in further traffic.
For these two reasons, I think the post, and in particular the suggestion to post links in relevant places around the web is fundamentally sound. A tastefully placed link can change somebody’s life!
I agree wholeheartedly; I wound up here from such a link on a friend’s blog.
I wonder if the LW community could become a kind of civil and rational “Anonymous.” Imagine a force devoted to raising the water level, in a polite, civil tone, all over the Internet. Even taking 10–20 minutes a day to take one of these actions would add up over time: a polite, constructive response (I recall seeing a post on the topic of saying “you’re wrong” in a polite way) with a link to a useful article.
I created an account just to upvote this. What would be the next step? Can people who feel knowledgeable enough just...start? And let everyone know how it goes?
I don’t think this is the most pleasant way to be introduced to rationality, but irrational behavior as a teaching device has worked surprisingly often for me.
Most of the rationalists I know are self taught, and the reason they became interested in avoiding bias and fallacies in the first place is almost universally that they got tired of ridiculous arguments with someone else who was using them and began looking for rules that forced thought processes to stay a bit more sane.
Nobody I know appreciates being shown flaws in their own arguments, but if you advocate that we should stop using fireplaces on Christmas Eve to protect Santa (for example), then sooner or later most people will come up with a good reason why your argument is irrational. When they come up with the rule on their own and you concede defeat, the rule is marked as a way to win arguments, not a tool other people can use to disprove one’s own ideas.
And once someone accepts that thoughts can be flawed and knows how to identify them, it is a hard habit to break. Each fundamental idea that helps you think rationally is slightly easier to accept than the last one, even if actually applying them without help is still difficult.
Assume your subject has encountered common sense at some point in their life. Leverage their hindsight bias and confirmation bias to give them the idea that being sensible is the assumed good idea. “People with common sense like us …” You will even be telling the truth on all levels.
Is this an actual example you used? Who would bother rebutting it?
No, it’s a horrible example. I would try to be a more convincing undercover rationalist.
What if you are accidentally too persuasive and you convince people of ridiculous things? Do you only use things that seem ridiculous but also seem like you could actually believe them?
I ranted once at my fundie mother about how and why taking the Bible literally constituted deliberate misreading of it and that this was evidence of bad and unclear thinking and was therefore an error of religion. I’m not sure if she was convinced at all, but she’s sure never brought up religion with me since. Note that I am in no way a Christian, in fact being completely atheist. I am still unsure if this constituted dark arts. I did, however, intend it to be a seed of the notion that joined-up thinking is not optional.
I have considered the possibility of my strategy backfiring, so I try to choose something I know they would be (relatively) certain not to accept, which also has simple ways you can show it is irrational, and something I do not actually believe (so I do not need to go back later and tell them why my original opinion was probably correct after all). The Santa Claus example might work if you were joking with a friend, but if you are in a more serious situation, something which is less obviously a trap would work better.
So far, I have not knowingly convinced anyone I know to accept an insane idea, but it is still something to watch out for.
I think that you severely underemphasize the importance of links. As has been pointed out on LW before, you have to acquire a lot of rationality before it really starts to do much for you / starts to reinforce itself over time rather than diminish in the face of external stimuli. Linking back to them is critical so the budding rationalist actually knows where to go to get more rationality instead of quickly forgetting about the incident.
I feel like this post is marginally useful, but mostly functions as an applause light and shouldn’t be on the front page.
Absolutely. This post is of a much lower standard than the other recent post by luke that he did not think worth front posting.
When I was 12, I, in my infinite wisdom, decided that eugenics was necessary to save humanity, and went online to debate my belief, where I was promptly defeated by a biologist who knew what he was talking about.
It took me a while to admit to being wrong, and I never did so publicly. Instead, I kept trying to patch the holes in my position, even as they were being exposed at an incredible rate.
Nonetheless, I regard this experience as formative in becoming a rationalist. “Planting the seed of rationality” may be successful, but you will often never have the satisfaction of knowing when it works.
Does the record of the debate still exist? For various reasons, I wouldn’t spend time advocating eugenics, but I don’t think there’s much of a biological argument against it.
It appears the parent website of the forums took them down a few years ago, so probably not.
The biological argument is that it doesn’t necessarily work, and all the societal changes I advocated with it for implementation had much stronger things against them.
The Wayback Machine might have it.
Nah, already checked—it only archived the front page of the forums. That’s actually how I found the parent website, though.
Ah, oh well.
I made a conceptual jump that I’m not sure this post (or its author) intended, but that left me with a better impression of it than most people seem to be expressing.
I agree that actions like writing a letter to the editor may have a low rate of return in bringing new persons to the cause, but I believe that they serve very well at making people who are already pro-rational in name more likely to take greater actions at a later date. E.g., I didn’t get involved in running the skeptic group at my university until well after I publicly supported skepticism and atheism in letters to the editor of my campus newspaper. That is, I think maybe the point of this post is encouraging the sort of person who is now just reading LessWrong because it is shiny to go out and start doing things instead. The world changing will come later.
See chapter 3, “Commitment and Consistency” from Influence by Robert B. Cialdini or this post on the same by Anna Salamon and Steve Rayhawk.
We’ve had too many “let’s do X” posts. I have downvoted this one and will do so for any such in the future. I will upvote posts of the form “I have done X”, where X is a valuable thing to do.
What about “I have done X, and encourage you to do X also?”
Case-by-case basis. If it’s “I did X recently / am doing X currently, come join me”, then up. If it’s “I did X once three months ago and it seemed to work, maybe we should do this more often”, then down.
Has this ever worked for you? It never has for me.
It’s not that people change their position during the conversation. It’s that a few of them realize later that they didn’t have a good reply, and they think about it.
It’s hard to measure this, but it has happened to me many times, and I like to generalize from one example.
This actually works on me lots and lots. Slowly, but it does.
It works for me frequently.
That said, we may not be trying to accomplish the same thing.
What I find it works for, for me, is developing a better understanding of what assumptions and thought processes led to the statement, which over time gives me a better sense of how to express myself so that person understands me, and how to interpret what they say.
It’s frequently worked for me. To make a lasting impact usually takes persistence and a good demeanor though.
Seem “common-sensical”. They probably had parents who had occasion to deal with them attempting something dangerously stupid when they were little.
I think this worked on me. But my success anecdotes end there. So, reason enough to keep trying it, but also good reason to look for better tactics.
Yes, but you have to know your target and the circumstances. Some statements are considered “fair game” for such analysis, because the person would care that their statement was irrational, and some aren’t.
If you’re not sure which of these cases holds, I have found asking a low stakes question to be the best way to find out.
I endorse this wholeheartedly. Spread the basic memes of basic rationality far and wide.
I am trying to do this for my best friend right now...under the pretext of ‘helping her to control her emotions better.’ Searching through LessWrong to find helpful posts...
Did you find the luminosity series yet?
I have read the luminosity posts in the past, hadn’t thought of using them for that.
Is Which Parts are Me relevant to that?
Yes, probably. Thanks.
Also (on further thought) Cached Selves and the link in Simple embodied cognition hacks.
I think an interesting question in this context is whether rationalism is actually growing or not; today we may have more freedom to be rational (after all, some of the smartest people of previous ages were quite irrational), but are we, as a species, actually getting better at this? It seems that common knowledge (say, that the earth revolves around the sun) has been dramatically improving, but rationalism itself? A good second question would be what would be a good, operational way to measure this.
I have a glimmer that humor can work well in this capacity.
For instance, jokes about pathological reasoning can induce the listener to consider why certain chains of logic are (horrifically) invalid (and thus humorous), and perhaps apply these lessons to their own reasoning.
(There is also the related but less pleasant phenomenon in which the listener immediately recognizes one of their own flaws as the butt of the joke (as embodied in an abstract joke-land entity as opposed to themselves), laughs in embarrassment, and decides to alter their behavior (or company) based on how their acquaintances view that character trait, opinion, etc.)
I posted an example from The Simpsons in a recent rationality quote topic because that show occasionally hits this sort of humor right on the nose—I used “They did it because they’re stupid, that’s why everybody does everything”, but a similar situation occurs when Homer’s caught in traffic and says “Don’t worry, I have a secret weapon!” before starting to honk his horn furiously.
Such stories are used by Idries Shah in many of his books for precisely this purpose.
Quite intriguing! Can you suggest any of his books in particular as a good place to start?
To suddenly discover Idries Shah is a bit like discovering the Sequences: there is so much.
I’d suggest The Magic Monastery (subtitled “Analogical and Action Philosophy of the Middle East and Central Asia”), which is a mixture of traditional teaching stories and stories of his own composition.
Then there are the books of Nasruddin stories (see the biblio at the end of the wiki article). Nasruddin is a traditional figure in stories all over the Middle East, and you can find a lot of Nasruddin stories on Google.
Shah’s books are available in the US from ISHK and in the UK from Octagon. From places like Amazon too, of course, but the publishers’ sites collect everything together more conveniently.
Hah! I’ll have to check him out. I took a course on “Islamicate” literature and philosophy semi-recently, and found Sufism very interesting. Almost did my final paper on The Conference of the Birds (though the library’s translation was ancient, called Muslims “Muselmen”). Thanks! :D
By this all will know that you are rationalists, if you have less akrasia than most.
Sharing ideas and results can lead to more effective actions.
There are various ways to do this without huge overhead, e.g. a mailing list. (LW probably counts as something with a fair bit of overhead.)
The use of this poem contributes quite a bit to the argument as it is a factual event and a future possible event.
It is a positive action to do something that will be beneficial within one’s own lifetime, and also to repeat something that has been done in the past that is a current benefit.
Planting trees has the benefit of carbon sequestration and the added benefit of providing growth of known positive environmental factors such as increased biodiversity.
The negative aspect of this post is that the wording is similar to religious propaganda such as used by World Vision.