In writing this I considered the virtue of silence, and decided to voice something explicitly.

If rationality is ready for outreach, it should be done in as bulletproof a way as possible.

Before today I hadn't read deeply into the articles published by Gleb. Owing to this comment:
http://lesswrong.com/lw/mze/marketing_rationality/cwki
and
http://lesswrong.com/lw/mz4/link_lifehack_article_promoting_lesswrong/cw8n
I went and explicitly read a handful of Gleb's articles. Prior to this I had simply avoided getting in his way (virtue of silence: not reading means not being critical and not judging someone who is trying to make progress).

These are the articles in question (to be clear):
http://intentionalinsights.org/7-surprising-science-based-hacks-to-build-your-willpower
http://intentionalinsights.org/take-the-outside-view
http://intentionalinsights.org/the-easiest-trick-to-breaking-out-of-wrong-ideas
http://intentionalinsights.org/is-kevin-mccarthy-the-smartest-man-in-washington
http://intentionalinsights.org/are-we-wired-for-trump
http://intentionalinsights.org/you-can-predict-the-future
http://intentionalinsights.org/meaning-and-purpose-quantified-and-customized
http://intentionalinsights.org/what-do-politics-have-to-do-with-meaning-and-purpose-in-life
I don’t like any of them. I find the quality of the rationality to be weak; I find the prose to be varying degrees of spider-creepy (although not as bad as OrphanWilde finds things). If I had a button that I could push to make these go away today, I would. I would also be disheartened if Gleb stopped trying to do what he is trying to do. (This is a summary of my experiences with these articles. I can break them down, but that would take longer to do.)

I believe in spreading rationality; I just need the material to pass my bullshit meters, and preferably be right up there as bulletproof if it can be done. Unfortunately the process of generating material is literally hard work that I want to not do (for the most part), and I expect other people also want to avoid doing hard work. (I sometimes do hard work, and sometimes find work-arounds for doing it anyway, but it’s still hard. If rationality were easy/automatic, more people would already be doing it.)

Hopefully this adds volume to the side of the discussion opposing Gleb’s work so far, without sounding like it’s attacking...
Something said earlier:
[an article Gleb wrote...] was shared over 2K times on social media, so it probably had views in the tens of thousands if not hundreds. Then, over 1K people visited the Intentional Insights website directly from the Lifehack website.
I wanted to add that this is a pretty low number for clickbait. Almost worth considering it “failed clickbait”, to me.
If rationality is ready for outreach, it should be done in as bulletproof a way as possible.
Why?
Now that we know that Newtonian physics was wrong, and Einstein was right, would you support my project to build a time machine, travel to the past, and assassinate Newton? I mean, it would prevent incorrect physics from being spread around. It would make Einstein’s theory more acceptable later; no one would criticize him for being different from Newton.
Okay, I don’t really know how to build a time machine. Maybe we could just go burn some elementary-school textbooks, because they often contain too simplified information. Sometimes with silly pictures!
Seems to me that I often see the sentiment that we should raise people from some imaginary level 1 directly to level 3, without going through level 2 first, because… well, because level 3 is better than level 2, obviously. And if those people perhaps can’t make the jump, I guess they simply were not meant to be helped.
This is why I wrote about “the low-hanging fruit that most rationalists wouldn’t even touch for… let’s admit it… status reasons”. We are (or imagine ourselves to be) at level 3, and all levels below us are equally deplorable. Helping someone else to get on level 3, that’s a worthy endeavor. Helping people get from level 1 to level 2, that’s just pathetic, because the whole level 2 is pathetic. Even if we could do that at a fraction of the cost.
Maybe that’s true when building a superhuman artificial intelligence (better to get it a hundred years later than to get it wrong), but it doesn’t apply to most areas of human life. Usually, an improvement is an improvement, even when it’s not perfect.
Making all people rationalists could be totally awesome. But making many stupid people slightly less stupid, that’s also useful.
Let’s start with a false statement from one of Gleb’s articles:
Intuitively, we feel our mind to be a cohesive whole, and perceive ourselves as intentional and rational thinkers. Yet cognitive science research shows that in reality, the intentional part of our mind is like a little rider on top of a huge elephant of emotions and intuitions. This is why researchers frequently divide our mental processes into two different systems of dealing with information, the intentional system and the autopilot system.
What’s false? Researchers don’t use the terms “intentional system” and “autopilot system”.
Why is that a problem? Aren’t the terms close enough to System 1 and System 2?

A person who’s interested might want to read additional literature on the subject. The fact that the terms Gleb invented don’t match the existing literature means it’s harder for a person to go from reading Gleb’s articles to reading higher-level material.

If the person digs deeper they will sooner or later run into trouble. They might have a conversation with a genuine neuroscientist, talk about the “intentional system” and “autopilot system”, and find that the neuroscientist hasn’t heard of the distinction being made in those terms.

It might take a while until they understand that deception happened, and it might hinder them from progressing.
I think talking about System 1 and System 2 in the way Gleb does raises the risk of readers coming away believing that reflective thinking is superior to intuitive thinking. It suggests that rationality is about using System 2 for important issues, instead of focusing on aligning System 1 and System 2 with each other, the way CFAR proposes.

The stereotype of people who categorically prefer System 2 to System 1 is the straw Vulcan. Level 2 of rationality is not “being a straw Vulcan”.
In the article on his website Gleb says:
The intentional system reflects our rational thinking, and centers around the prefrontal cortex, the part of the brain that evolved more recently.
That sounds to me like neurobabble. Kahneman doesn’t say that System 2 is about a specific part of the brain.

Even if it were completely true, having that knowledge doesn’t help a person be more rational. If you want to make a message as simple as possible, you could drop that piece of information without any problem.

Why doesn’t he drop it and make the article simpler? Because it helps with pushing an ideology: what other people in this thread called rationality as religion, the kind of rationality that fills someone’s sense of belonging to a group.

I don’t see people’s rationality getting raised in that process.
That leads to the question of “what are the basics of rationality?”
I think the Facebook group sometimes provides a good venue for understanding what new people get wrong. Yesterday one person accused another of being a fake account. I asked the accuser for his credence, but he replied that he can’t give a probability for something like that. The accuser didn’t think in terms of Cromwell’s rule.
Making that step from thinking “you are a fake account” to having a mental category of “80% certainty: you are a fake account” is progress. No neuroscience is needed to make that progress.
Rationality for beginners could attempt to teach Cromwell’s rule while keeping it as simple as possible. I’m even okay if the term Cromwell’s rule doesn’t appear. The article can have pretty pictures, but it shouldn’t make any false claims.
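To make that concrete, here is a minimal sketch (my own illustration, not taken from any of the articles discussed) of a Bayesian update in Python. The function name and the numbers are invented for the example; the point is Cromwell’s rule: a credence of “about 80%” can move when evidence arrives, while a credence of exactly 0 or 1 can never move.

```python
def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Return P(hypothesis | evidence) for a single piece of evidence."""
    numerator = prior * p_evidence_given_h
    denominator = numerator + (1 - prior) * p_evidence_given_not_h
    return numerator / denominator

# "80% certainty: you are a fake account" can still be revised by counter-evidence:
print(bayes_update(0.8, 0.3, 0.9))   # ~0.57 after evidence that fits a real account better

# A credence of exactly 1.0 ignores all evidence, which is what Cromwell's rule warns against:
print(bayes_update(1.0, 0.3, 0.9))   # stays 1.0 no matter what
```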
I admit that “What are the basics of rationality?” isn’t an easy question. This community often complicates things.
Scott recently wrote “What developmental milestones are you missing?”. That article lists 4 milestones, with one of them being Cromwell’s rule (Scott doesn’t name it).

In my current view of rationality, other basics might be TAPs, noticing, tiny habits, “how not to be a straw Vulcan”, and “have conversations with the goal of learning something new yourself, instead of just affecting the other person”.

A good way of searching for basics might also be to notice events where you yourself go: “Why doesn’t this other person get how the world works? X is obvious to people at LW; why do I have to suffer from living in a world where people don’t get X?”

I don’t think the answer to that question will be that people think the prefrontal cortex is about System 2 thinking.
I agree with much of this, but that quote isn’t a false claim. It does not (quite) say that researchers use the terms “intentional system” and “autopilot system”, which seem like sensible English descriptions if for some bizarre reason you can’t use the shorter names. Now, I don’t know why anyone would avoid the scholarly names when for once those make sense—but I’ve also never tried to write an article for Lifehack.
What is your credence for the explanation you give, considering that, e.g., the audience may remember reading about many poorly-supported systems with levels numbered I and II? Seeing a difference between that and the recognition that humans evolved may be easier for some than evaluating journal citations.
which seem like sensible English descriptions if for some bizarre reason you can’t use the shorter names
Kahneman’s motivation for using System 1 and System 2 isn’t to have shorter names. It’s that people have existing preconceptions about the words that describe mental concepts, and he doesn’t want to invoke them.

Wikipedia’s summary of Kahneman:
In the book’s first section, Kahneman describes two different ways the brain forms thoughts:
System 1: Fast, automatic, frequent, emotional, stereotypic, subconscious
System 2: Slow, effortful, infrequent, logical, calculating, conscious
Emotional/logical is a different distinction than intentional/autopilot. Trained people can switch emotions on and off via their intentions, and the process has little to do with being logical or calculating.

That said, giving the systems new names that scientists don’t use might be a valid move. But if you do that, you should be open about the fact that you invented new names.

Given the public nature of science, I also think you should be open about why you chose certain terms, and choosing new terms should come with an explanation of why you prefer them over the alternatives.

The reason shouldn’t be that your organisation is named “Intentional Insights” and that’s why you call it the “intentional system”. Again, that pattern leads to the position that rationality is about using System 2 instead of System 1, which differs from the CFAR position.
In Gleb’s own summary of Thinking, Fast and Slow he writes:
System 1 runs automatically and System 2 is normally in a comfortable low-effort mode, in which only a fraction of its capacity is engaged. System 1 continuously generates suggestions for System 2: impressions, intuitions, intentions, and feelings.
Given that in Kahneman’s framework intentions are generated by System 1, calling System 2 the “intentional system” produces problems.
What is your credence for the explanation you give,
Explanations don’t have credence, predictions do. If you specify a prediction I can give you my credence for it.
Maybe we could just go burn some elementary-school textbooks, because they often contain too simplified information. Sometimes with silly pictures!
Did you ever read about Feynman’s experience reading science textbooks for elementary school? (It’s available online here.)
There are good and bad ways to simplify.
This is why I wrote about “the low-hanging fruit that most rationalists wouldn’t even touch for… let’s admit it… status reasons”.
Sure, there are people I’d rather not join the LessWrong community for status reasons. But I don’t think the resistance here is about status instead of methodology. Yes, it would be nice to have organizations devoted to helping people get from level 1 to level 2, but if you were closing your eyes and designing such an organization, would it look like this?
(Both agreeing with and refining your position, and directed less to you than the audience):
Personally, I’m at level 21, and I’m trying to raise the rest of you to my level.
Now, before you take that as a serious statement, ask yourself how you feel about that proposition, and how inclined you would be to take anything I said seriously if I actually believed that. Think about to what extent I behave like I -do- believe that, and how that changes the way what I say is perceived.
http://lesswrong.com/lw/m70/visions_and_mirages_the_sunk_cost_dilemma/ ← This post, and pretty much all of my comments, had reasonably high upvotes before I revealed what I was up to. Now, I’m not going to say it didn’t deserve to get downvoted (I learned a lot from that post that I should have known going into it), but I’d like to point out the fundamental similarities, scaled up a level, between what I do there and typical rationalist “education”: “Here’s a thing. It was a trick! Look at how easily I tricked you! You should now listen to what I say about how to avoid getting tricked in the future.” Worse, cognitive dissonance will make it harder to fix that weakness in the future. As I said, I learned a -lot- in that post; I tried to shove at least four levels of plots and education into it, and instead turned people off with the first or second one. I hope I taught people something, but in retrospect, and far removed from it, I think it was probably a complete and total failure which mostly served to alienate people from the lessons I was attempting to impart.
The first step to making stupid people slightly less stupid is to make them realize the way in which they’re stupid in the first place, so that they become willing to fix it. But you can’t do that, because, obviously, people really dislike being told they’re stupid. Because there are some issues inherent in approaching other people with the assumption that they’re less than you, and that they should accept your help in raising them up. You’re asserting a higher status than them. They’re going to resent that, and cognitive dissonance is going to make them decide that the thing you’re better at, either you aren’t, or that it isn’t that important. So if you think that you can make “stupid people slightly less stupid”, you’re completely incompetent at the task.
But… show them that -you- are stupid, and show them you becoming less stupid, and cognitive dissonance will tell them that they were smarter than you, and that they already knew what you were trying to teach them. That’s a huge part of what made the Sequences so successful—riddled throughout it were admissions of Eliezer’s own weakness. “This is a mistake I made. This is what I realized. This is how I started to get past that mistake.” What made them failures, however, is the way they made those who read them feel Enlightened, like they had just Leveled Up twenty times and were now far above ordinary plebeians. The critical failure of the Sequences is that they didn’t teach humility; the lesson you -should- come away from them with is the idea that, however much Less Wrong you’ve become, you’re still deeply, deeply wrong. And that’s okay.
Which provokes a dilemma: everybody who wants to teach rationality to others, because it leveled them up twenty times and look at those stupid people falling prey to the non-central fallacy on a constant basis, is completely unsuitable to do so.

So did I succeed? Or did I fail? And why?
This is a pretty confusing point. I have plenty of articles where I admit my failures and discuss how I learned to succeed.
Secondly, I have only started publishing on Lifehack (3 published so far), and my articles far outperform the average of being shared under 1K. This is the average for experienced and non-experienced writers alike. My articles have all been shared over 1K times, and some twice as much if not more. The fact that they are shared so widely is demonstrable evidence that I understand my audience and engage it well.
BTW, curious if any of these discussions have caused you to update on any of your claims to any extent?
I now assign negligible odds to the possibility that you’re a sociopath (used as a shorthand for any of a number of hostile personality disorders) masquerading as a normal person masquerading as a sociopath, and somewhat lower odds on you being a sociopath outright, with the majority of assigned probability concentrating on “normal person masquerading as sociopath” now. (Whether that’s how you would describe what you do or not, that’s how I would describe it, because the way you write lights up my “Predator” alarm board like a nearby nuke lights up a “Check Engine” light.)
The fact that they are shared so widely is demonstrable evidence that I understand my audience and engage it well.
Demonstrable evidence that you do so better than average isn’t the same as demonstrable evidence that you do so well.
Thanks for sharing about your updating! I am indeed a normal person, and have to put a lot of effort into this style of writing for the sake of what I perceive as a beneficial outcome.
I personally have updated away from you trolling me and see you as more engaged in a genuine debate and discussion. I see we have vastly different views on the methods of getting there, but we do seem to have broadly shared goals.
Fair enough on different interpretations of the word “well.” As I said, my articles have done twice as well as the average for Lifehack articles, so we can both agree that it is demonstrable evidence of a significant and above-average level of competency in an area where I am just starting (3 articles so far), although the term “well” is fuzzier.
The critical failure of the Sequences is that they didn’t teach humility; the lesson you -should- come away from them with is the idea that, however much Less Wrong you’ve become, you’re still deeply, deeply wrong.
Mmm. I typically dislike framings where A teaches B, instead of framings where B learns from A.
The Sequences certainly tried to teach humility, and some of us learned humility from The Sequences. I mean, it’s right there in the name that one is trying to asymptotically remove wrongness.
The main failing, if you want to put it that way, is that this is an online text and discussion forum, rather than a dojo. Eliezer doesn’t give people gold stars that say “yep, you got the humility part down,” and unsurprisingly people are not as good at determining that themselves as they’d like to be.
Mmm. I typically dislike framings where A teaches B, instead of framings where B learns from A.
Then perhaps you’ve framed the problem you’re trying to solve in this thread wrong. [ETA: Whoops. Thought I was talking to Villiam. This makes less-than-sense directed to you.]
The Sequences certainly tried to teach humility, and some of us learned humility from The Sequences. I mean, it’s right there in the name that one is trying to asymptotically remove wrongness.
I don’t think that humility can be taught in this sense, only earned through making crucial mistakes, over and over again. Eliezer learned humility through making mistakes, mistakes he learned from; the practice of teaching rationality is the practice of having students skip those mistakes.

He shouldn’t, even if he could.
The main failing, if you want to put it that way, is that this is an online text and discussion forum, rather than a dojo. Eliezer doesn’t give people gold stars that say “yep, you got the humility part down,” and unsurprisingly people are not as good at determining that themselves as they’d like to be.
Then perhaps you’ve framed the problem you’re trying to solve in this thread wrong.
Oh, I definitely agree with you that trying to teach rationality to others to fix them, instead of providing a resource for interested people to learn rationality, is deeply mistaken. Where I disagree with you is the (implicit?) claim that the Sequences were written to teach instead of being a resource for learning.
I don’t think that humility can be taught in this sense, only earned through making crucial mistakes, over and over again.
Mmm. I favor Bismarck on this front. It certainly helps if the mistakes are yours, but they don’t have to be. I also think it helps to emphasize the possibility of learning sooner rather than later; to abort mistakes as soon as they’re noticed, rather than when it’s no longer possible to maintain them.
Ah! My apologies. Thought I was talking to Villiam. My responses may have made less than perfect sense.
I favor Bismarck on this front. It certainly helps if the mistakes are yours, but they don’t have to be.
You can learn from mistakes, but you don’t learn what it feels like to make mistakes (which is to say, exactly the same as making the right decision).
I also think it helps to emphasize the possibility of learning sooner rather than later; to abort mistakes as soon as they’re noticed, rather than when it’s no longer possible to maintain them.
That’s where humility is important, and where the experience of having made mistakes helps. Making mistakes doesn’t feel any different from not making mistakes. There’s a sense that I wouldn’t make that mistake, once warned about it; and thinking you won’t make a mistake is itself a mistake, quite obviously. Less obviously, thinking you will make mistakes, but that you’ll necessarily notice them, is also a mistake.

The solution to the meta-level confusion (it’s turtles all the way down, anyway) is to spend a few years building up an immunity to iocane powder.
I address the concerns about the writing style and content in my just-written comment here. Let me know your thoughts about whether that helps address your concerns.
Regarding clickbait and sharing, let’s actually evaluate the baseline. I want to highlight that 2K is quite a bit higher than the average for a Lifehack article. A typical article does not rise above 1K, and that’s considered pretty good. So my articles have done really well by comparison to other Lifehack articles. Since that’s the baseline, I’m pretty happy with where the sharing is.
Why would you be disheartened if I stopped what I was trying to do?
EDIT: Also forgot to add that some of the articles you listed were not written by me but by another aspiring rationalist, so FYI.

No, that does not answer the issues I raised.

I am now going to take apart this article:
www.intentionalinsights.org/7-surprising-science-based-hacks-to-build-your-willpower
7 Surprising Science-Based Hacks To Build Your Willpower
Tempted by that second doughnut? Struggling to resist checking your phone? Shopping impulsively on Amazon? Slacking off by reading BuzzFeed instead of doing work? What you need is more willpower! Recent research shows that strengthening willpower is the real secret to the kind of self-control that can help you resist temptations and achieve your goals. The great news is that scientists say strengthening your willpower is not as hard as you might think. Here are 7 research-based hacks to strengthen your willpower!
Smile :-)
Smiling and other mood-lifting activities help improve willpower. In a recent study, scientists first drained the willpower of participants through having them resist temptation. Then, for one group, they took steps to lift people’s moods, such as giving them unexpected gifts or showing them a funny video. For another group, they just let them rest. Compared to people who just rested for a brief period, those whose moods were improved did significantly better in resisting temptation later! So next time you need to resist temptation, improve your mood! Smile or laugh, watch a funny video or two.
Low willpower resisting BuzzFeed and doughnuts? You should improve your mood: try some BuzzFeed or doughnuts.
Clench Your Fist
Clench your fists or partake in another type of activity where you exercise self-control. Studies say that exercising self-control in any physical domain causes you to become more disciplined in other facets of life. So do whatever works for you to exercise self-control when you are trying to fight temptations: clench your fist, squeeze your eyes shut, or you can even hold in your pee, just like UK Prime Minister David Cameron.
Meditate
Photo Credit: Gleb Tsipursky meditating in the park
Meditation is great for a lot of things – reducing stress, increasing focus, managing emotions. Now research suggests it even helps us build willpower! With all these benefits, can you afford not to meditate? An easy way to get started is to spend 10 minutes a day sitting in a calm position and focusing on your breath.
Reminders
Our immediate desires to give in to temptations make it really challenging to resist them. Our emotional desires seem like a huge elephant and our rational self is like a small elephant rider by comparison. However, one way to steer the elephant is to set in physical reminders in advance to remind ourselves of what our rational self wanted to do. So put a note on your fridge that says “only one doughnut” or set an alarm clock to buzz when you want to stop playing video games.
Eat
Did you know that your willpower is powered by food? No wonder’s it’s so hard to diet! When we don’t eat, our willpower goes down the drain. The best cure is a meal rich in protein, which enables the most optimal willpower.
Self-Forgiveness
How is self-forgiveness connected to willpower? Well, what the science shows is that feelings of regret deplete your willpower. This is why those who eat a little too much ice cream and feel regret are then much more likely to just let themselves go and eat the whole pint or even gallon! Instead, when you give in to temptation, be compassionate toward yourself and forgive yourself. That way, you’ll have more willpower going forward!
Commitment
The most important thing to strengthen your willpower is commitment to doing so! Only by committing to improving your willpower every day will you be able to take the steps described above. To do so, evaluate your situation and why you want to strengthen your willpower, make a clear decision to work on improving this area, and set a long-term goal for your willpower improvement to have the kind of intentional life that you want.
Then break down this goal into specific and concrete steps that you will take based on the strategies described above. Research shows this is the best path for you to build your willpower!
So what are the specific and concrete steps that you will take to build your own willpower? Share your planned steps and the strategies that you will use in the comments section below!
To avoid missing out on content that helps you reach your goals, subscribe to the Intentional Insights monthly newsletter.
The generosity of readers like you made this article possible. If you benefited from reading it, please consider volunteering or/and making a tax-deductible contribution to Intentional Insights. Thank you for being awesome!
surprising
clickbait title. That’s fine.
science-Based
No. Bad. Not helpful.
Tempted by that second doughnut?
short term reward vs long term dieting goal with less salient rewards. You didn’t explain that.
Struggling to resist checking your phone?
How about asking why? Rather than attracting someone to ring the “yes, that’s me” bells in their head, attract them to the “I should sort that out” bells.
Shopping impulsively on Amazon?
What, really? I know these are just your examples, but you should use solid ones. And don’t name-drop Amazon and BuzzFeed.
Slacking off by reading BuzzFeed instead of doing work? What you need is more willpower! Recent research shows
Research from whom? The magical scientists?
that strengthening willpower is the real secret
“Real secret”? Kill me now.
to the kind of self-control that can help you resist temptations and achieve your goals. The great news is that scientists say
“Scientists say”: are they the same ones that were doing the research before, or different ones?
strengthening your willpower is not as hard as you might think. Here are 7 research-based hacks to strengthen your willpower!
Smile :-)
Not an inherently bad suggestion, but it really doesn’t have a lot to do with willpower. If you are a human who never smiles, you shouldn’t be looking at willpower hacks; you should be solving that problem first.
Smiling and other mood-lifting activities help improve willpower. In a recent study,
a recent study not linked to
“Scientists”: them again!
first drained the willpower of participants through having them resist temptation. Then, for one group, they took steps to lift people’s moods, such as giving them unexpected gifts or showing them a funny video. For another group, they just let them rest. Compared to people who just rested for a brief period, those whose moods were improved did significantly better in resisting temptation later!
in what way does this connect smiling and willpower?
So next time you need to resist temptation, improve your mood! Smile or laugh, watch a funny video or two.
no. This is reaching not-even-wrong territory.
Clench Your Fist
nothing wrong with this suggestion but it’s a bit of a weak effect.
Clench your fists or partake in another type of activity where you exercise self-control. Studies say
“Studies say”: which ones? Where?
that exercising self-control in any physical domain causes you to become more disciplined in other facets of life.
Hey, wait: the nebulous idea of willpower, the draw of the “science-based”, the entire idea that there are super-secret answers, is entirely the opposite of what rationality wants to convey. Truth is, if there were any ideas that really worked, you would already know about them, and probably already be using them. The entire idea of “maybe if I keep searching for ideas I can uncover a secret truth” is wrong. It’s an overstretch of exploration in the exploration-exploitation dilemma. The worst part is that partial reinforcement reinforces addictive behaviours (like endlessly browsing BuzzFeed) more than anything else.
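To illustrate what “an overstretch of exploration” means here, a minimal epsilon-greedy sketch in Python (my own toy example; the options and numbers are invented, not taken from the article). With epsilon pushed near 1, the agent spends most of its time sampling new options instead of exploiting the one it already knows works, which is roughly the failure mode of endlessly hunting for new hacks.

```python
import random

def epsilon_greedy_choice(estimated_values, epsilon):
    """With probability epsilon pick a random option (explore); otherwise pick the best-looking one (exploit)."""
    if random.random() < epsilon:
        return random.randrange(len(estimated_values))
    return max(range(len(estimated_values)), key=lambda i: estimated_values[i])

values = [0.9, 0.2, 0.1]  # option 0 is the habit you already know works
choices = [epsilon_greedy_choice(values, epsilon=0.95) for _ in range(10_000)]
print(sum(c == 0 for c in choices) / len(choices))  # ~0.37: most of the time goes to worse options
```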
So do whatever works for you to exercise self-control when you are trying to fight temptations: clench your fist, squeeze your eyes shut, or you can even hold in your pee
just like UK Prime Minister David Cameron. What? Not a good thing to be referencing.
I’m gonna stop because this feels too much like a waste of time.
I realise it’s easier to criticise than generate content; I plan to try to do that in contrast to this article if I get the time.
So next time you need to resist temptation, improve your mood! Smile or laugh, watch a funny video or two.
no. This is reaching not-even-wrong territory.
On what basis? It matches my experience, something similar has been discussed on LW before, and it would seem to match various theoretical considerations about human psychology.
Truth is, if there were any ideas that really worked, you would already know about them, and probably already be using them.
This seems like a very strong and mostly unjustified claim.
E.g. even something like the Getting Things Done system, which works very well for lots and lots of people and has been covered and promoted in numerous places, is still something that’s relatively unknown outside the kinds of circles where people are interested in this kind of thing. A lot of people in the industrialized world could benefit from it, but most haven’t even tried it.
Ideas spread slowly, and often at a rate that’s only weakly correlated with their usefulness.
Given that you didn’t address the following, let me address it:
Did you know that your willpower is powered by food? No wonder’s it’s so hard to diet! When we don’t eat, our willpower goes down the drain. The best cure is a meal rich in protein, which enables the most optimal willpower.
To me that raises a harmful-untruth flag.

Roy Baumeister suggests that meals help with willpower through glucose. To me, the claim that it’s protein that builds willpower looks unsubstantiated. It certainly isn’t backed up by the Israeli judges study.

Where does the harm come into play? I understand the nutritional consensus to be that most people eat meals with too much protein. Nutrition science is often wrong, but that means one should be careful about advising people to raise the protein content of their meals.
The nutritional consensus is also not about optimizing willpower. I would be somewhat skeptical of the claim that the willpower-optimizing meal just luckily happens to be identical to the health-optimizing meal.

I haven’t made that claim.

In the sense that you didn’t make it, neither did I say that you did.

My argument is about two issues:
1) There is no reason to believe that protein increases willpower.
2) If you tell people a lie to make them improve their diet, it’s at least defensible if they end up healthier as a result. If your lie instead makes them eat a less healthy diet, you really screwed up.
Apart from that, I don’t believe that eating glucose directly to increase your willpower is a good idea or healthy.
Why not helpful?

I speak in the tone of listicle articles reluctantly, as I wrote above. It’s icky, but necessary to get this past the editors at Lifehack and elsewhere.
a recent study not linked to
Actually, it is linked to. You can check out the article for the link, but here is the link itself if you’re curious: www.albany.edu/~muraven/publications/promotion files/articles/tice et al, 2007.pdf
Smile or laugh, watch a funny video or two.
no. This is reaching not-even-wrong territory.
This I just don’t get. If experiments say you should watch a funny video, and they do, as the link above states, why is this not-even-wrong territory?