http://slatestarcodex.com/2014/03/13/five-years-and-one-week-of-less-wrong/ should be worth reading to get up to speed on the current LW ideology.
CFAR’s vision page is also a good summary of what this community considers rationality to be about.
You will find that Scott’s article summarizing the knowledge that LW has produced doesn’t even use the word “logic”. The CFAR vision statement uses the word once, and only near the bottom of the page.
One of the core insights of LW is that teaching people who want to be rational to actually be rational isn’t easy. We don’t have an easy guide to rationality that we can give people and that then makes them rational.
When it comes to winning, most other people do have goals that they care about. If you tell a bodybuilder about the latest research on supplements or muscle building, he’s going to be interested. Having that knowledge makes him more effective at the goals that he cares about. For him, that knowledge isn’t useless nerdy stuff.
Insofar as rationality is about winning, the bodybuilder cares about winning in the domain of muscle building.
Of course you also have to account for status effects. Some people pretend to care about certain goals but aren’t willing to actually pursue those goals efficiently.
There isn’t any point where someone has to self-identify as a rationalist.
Thanks, that’s interesting. Scott is always a good read.
Again, I’d have to disagree that the “winning” paradigm is useful in encouraging rational thought. Irrational thought can in many instances at least appear to be a good strategy for what the average person understands as “winning”, and it additionally evokes a highly competitive psychological state that is a major source of bias.
If you consider good strategies to be irrational, then you mean something different by “rational” than what the term usually refers to on LW.
A used car salesperson convincing themselves that what they’re selling isn’t a piece of crud is an example of where irrationality is a “good” (effective) strategy. I don’t think that’s what we are trying to encourage here. That’s why I say instrumental truthiness: the truth part is important too.
I also maintain that a focus on “winning” is psychologically in conflict with truth-seeking. Politics = mind-killer is the best example.
I think the orthodox LW view would be that this used car salesperson might have an immoral utility function but that he isn’t irrational.
> I also maintain that a focus on “winning” is psychologically in conflict with truth-seeking.
That basically means that sometimes the person who seeks the truth doesn’t win. That outcome isn’t satisfactory to Eliezer.
In “Rationality is Systematized Winning” he writes:
> If the “irrational” agent is outcompeting you on a systematic and predictable basis, then it is time to reconsider what you think is “rational”.
Of course you can define rationality for yourself differently, but it’s a mistake to project your own goals onto others.
A recent article titled “Truth, it’s not that great” got 84% upvotes on LW.
I am surprised that a significant group of people think that rationality is inclusive of useful false beliefs. Wouldn’t we call LW an effectiveness forum, rather than a rationalist forum, in that case?
> That basically means that sometimes the person who seeks the truth doesn’t win.
I think you’re reading too much into that one quite rhetorical article, but I acknowledge he prioritises “winning” quite highly. I think he ought to revise that view. Trying to win with false beliefs risks not achieving your goals while being oblivious to that fact. Like a mad person killing their friends because he/she thinks they’ve turned into evil dog-headed creatures or some such (an exaggeration to illustrate my point).
> Of course you can define rationality for yourself differently, but it’s a mistake to project your own goals onto others.
Fair point. And maybe you’re right that I’m in the minority… I’m still not certain. I do note that upvotes do not indicate agreement, only a feeling that the article is an interesting read, etc. Also, I note that many comments disagree with the article. It warrants further investigation for me, though.
> I am surprised that a significant group of people think that rationality is inclusive of useful false beliefs.
Often they use “instrumental rationality” for that meaning and “epistemic rationality” for the other one. Searching this site for “epistemic instrumental” returns some relevant posts.