I think you may have missed 3: most people optimise for perceived status.
Or was that included under ‘folk psychology’?
Possibly, or I just thought it not worth retaining. “People care what other people think of them” is the same idea, but without the LessWrong jargon, and as such is a truism known to everyone.
The key thing is this: when a rationalist is investigating a bias or some irrational behavior, they may notice that there seems to be a social influence on their thinking, think to themselves, “well, that’s obviously silly and wrong”, and then stop there. They go on believing that rationality has to be painful, that they have to do something to overpower these instincts, and that the only way to succeed is to look for ways to trick their unconscious mind into having a belief that seems more appropriate.
An alternative to this approach is to keep going, to look deeper at what’s really going on, to spend hours or days looking for something sensible that the unconscious could possibly be doing, until enough pieces come together and suddenly you say, “Oh. That’s what’s going on.” And then comes the most important part: you can solve the problem, so that it’s not hard or painful anymore.
Or, for something more direct and actionable: if you notice that your aversion to something comes from not wanting to look stupid, then rather than trying to power through it, look for ways that you could do the same thing without looking stupid. In fact, if you look at a lot of the useful rationality techniques, this is exactly how they help us out.
when a rationalist is investigating a bias or some irrational behavior, they may notice that there seems to be a social influence on their thinking, think to themselves, “well, that’s obviously silly and wrong”, and then stop there. They go on believing that rationality has to be painful, that they have to do something to overpower these instincts, and that the only way to succeed is to look for ways to trick their unconscious mind into having a belief that seems more appropriate.
I can’t say I’ve particularly noticed this.
An alternative to this approach is to keep going, to look deeper at what’s really going on
This is what I am more familiar with myself.
Cool!