The big reason? Construal theory, or as I like to call it, action is not an abstraction. Abstract construal doesn’t prime action; concrete construal does.
Second big reason: the affect (yes, I do mean affect) of being precise is very much negative. Focusing your attention on flaws and potential problems leads to pessimism, not optimism. But optimism is correlated with success; pessimism is not.
Sure, pessimism has some benefits in a technical career, in terms of being good at what you do. But it’s in conflict with other things you need for a successful career. TV’s Dr. House is an extreme example, but most real people are not as good at the technical part of their job as House, nor is the quality of their results usually as important.
Both of these things combine to create the next major problem: a disposition to non-co-operative behavior, aka the “why can’t our kind get along?” problem.
Yes, not everyone has these issues, diverse community, etc. But, as a stereotypical and somewhat flippant summary, the issue is that simply by the nature of valuing truth—precise truth, rather than the mere idea of truth—one is treating it as more important than other goals. That means it’s rather unlikely that a person interested in it will be sufficiently interested in other goals to make progress there. I would expect that a person who is not naturally inclined towards rationalism would be more likely to put it to good use than someone who is just intellectually interested in rationalism as a conversation topic or as an ideal to aspire to.
To put it another way, if you already have “something to protect”, such that rationality is a means towards that end, then rationality can be of some value. If you value rationality for its own sake, well, then that is your goal, and so you can perhaps be called “successful” in relation to it, but it’s not likely that anyone who doesn’t value rationality for its own sake will consider your accomplishments impressive.
So, the truth value of “rationalists don’t win” depends on your definition of “win”. Is it “win at achieving their own, perhaps less-than-socially-valued goals”? Or “win at things that are impressive to non-rationalists”? I think the latter category is far less likely to occur for those whose terminal values are aimed somewhere near rationality or truth for its own sake.
Or the definition of rationalism. Maybe epistemic rationalism never had much to do with winning.
Epistemic rationality isn’t about winning?
Demonstrated, context-appropriate epistemic rationality is incredibly valuable and should lead to higher status and—to the extent that I understand Less-Wrong jargon—“winning.”
Think about markets: If you have accurate and non-consensus opinions about the values of assets or asset classes, you should be able to acquire great wealth. In that vein, there are plenty of rationalists who apply epistemic rationality to market opinions and do very well for themselves. Think Charlie Munger, Warren Buffett, Bill Gates, Peter Thiel, or Jeff Bezos. Winning!
If you know better than most who will win NBA games, you can make money betting on the games. E.g., Haralabos Voulgaris. Winning!
Know what health trends, diet trends, and exercise trends improve your chances for a longer life? Winning!
If you have an accurate and well-honed understanding of what pleases the crowd at Less Wrong, and you can articulate those points well, you’ll get Karma points and higher status in the community. Winning!
Economic markets, betting markets, health, and certain status-competitions are all contexts where epistemic rationality is potentially valuable.
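To make the betting examples above a bit more concrete, here is a minimal sketch of the arithmetic behind “knowing better than the market”. The function and the numbers (a 65% win estimate against decimal odds of 1.95) are hypothetical illustrations, not a description of how any of the people named above actually bet:

```python
# A minimal, illustrative sketch (made-up numbers, not a real betting model):
# if your probability estimate for an outcome is better than the one implied
# by the market odds, positive-expected-value bets exist.

def expected_profit(p_win: float, decimal_odds: float, stake: float = 1.0) -> float:
    """Expected profit of a single bet: win (decimal_odds - 1) * stake
    with probability p_win, lose the stake otherwise."""
    return p_win * (decimal_odds - 1.0) * stake - (1.0 - p_win) * stake

market_odds = 1.95    # market prices the game near 50/50, minus the bookmaker's margin
your_estimate = 0.65  # your (assumed better-calibrated) estimate of the win probability

print(expected_profit(your_estimate, market_odds))  # = 0.2675 per unit staked
print(expected_profit(0.50, market_odds))           # = -0.025: no edge, the margin costs you
```

The edge only exists if your estimate really is better calibrated than the market’s, which is the epistemic-rationality part of the story.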
Occasionally, however, epistemic rationality can be demonstrated in ways that are context-inappropriate – and thus lead to lower status. Not winning!
For example, if you correct someone’s grammar the first time you meet him or her at a cocktail party. Not winning!
Demonstrate that your boss is dead wrong in front of a group of peers in a way that embarrasses her? Not winning!
Constantly argue about LW-type topics with people who don’t like to argue? Not winning!
Epistemic rationality is a tool. It gives you power to do things you couldn’t do otherwise. But status games require a deft understanding of when it is, and is not, appropriate to demonstrate to others how closely one’s beliefs cohere with reality (which itself strikes me as a form of epistemic rationality applied to social awareness). Those who get it right are the winners. Those who do not are the losers.
Well, technically speaking, it isn’t. It is the propensity to select courses of action which will most likely lead to the outcomes you prefer. Correcting grammar on the first date is not a misapplication of epistemic rationality; it just is NOT epistemically rational (assuming reasonable context, e.g. you are not deliberately negging and you are less interested in grammar than in this particular boy/girl).
Epistemic rationality doesn’t save you from having bad goals. Or inconsistent ones.
ETA: Ah, sorry. I had a brain fart and was writing “epistemic rationality” while meaning “instrumental rationality”. So, er, um, disregard.
(I recognize that you meant instrumental rationality rather than epistemic rationality, and have read the comment with that in mind.)
Epistemic rationality is not equivalent to “being a Spockish asshole.” It simply means that one values rationality as an end and not just a means. If you do not value correcting people’s grammar for its own sake, then there is no reason to correct someone’s grammar. But that is an instrumental statement, so I suppose I should step back...
If you think that epistemic and instrumental rationality would disagree at certain points, try to reconsider their relationship. Any statement of “this ought to be done” is instrumental. Epistemic only covers “this is true/false.”
Epistemic rationality is not equivalent to “being a Spockish asshole.”
Yes, of course. Notably, epistemic rationality only requires you to look for and to prefer truth. It does not require you to shove the truth you found into everyone else’s face.
If you think that epistemic and instrumental rationality would disagree at certain points
One can find edge cases, but generally speaking if you treat epistemic rationality narrowly (see above) I would expect such a disagreement to arise very rarely.
On the other hand there are, as usual, complications :-/ For example, you might not go find the truth because doing this requires resources (e.g. time) and you feel those resources would be better spent elsewhere. Or, if you think you have difficulties controlling your mind (see the rider and the elephant metaphor), you might find it useful to employ tricks which involve deliberately denying some information to yourself.
So how does it differ from instrumental rationality?
See ETA to the comment.
I think this is a bad example. The example seems like an instrumental example. Epistemic alone would have you correct the grammar because that’s good epistemics. Instrumental would have you bend the rules for the other goals you have on the pathway to winning.
“See ETA to the comment.” Lumifer meant instrumental rationality.
Comment was before his ETA. Ta.
Hmm? Ah, I see; you think that I am annoyed. No, I only quoted Lumifer because their words nearly sufficed. Rest assured that I do not blame you for lacking the ability to gather information from the future.
How could correcting grammar be good epistemics? The only question of fact there is a practical one—how various people will react to the grammar coming out of your word-hole.
Demonstrated, context-appropriate epistemic rationality is incredibly valuable and should lead to higher status and—to the extent that I understand Less-Wrong jargon—“winning.”
Valuable to whom? Value and status aren’t universal constants.
You are pretty much saying that the knowledge can sometimes be instrumentally useful. But that does not show epistemic rationality is about winning.
The standard way to show that instrumental and epistemic rationality are not the same is to put forward a society where almost everyone holds to some delusory belief, such as a belief in Offler the Crocodile god, and awards status in return for devotion. In that circumstance, the instrumental rationalist will profess the false belief, and the epistemic rationalist will stick to the truth.
In a society that rewards the pursuit of knowledge for its own sake (which ours does sometimes), the epistemic rationalist will get rewards, but won’t be pursuing knowledge in order to get rewards. If they stop getting the rewards they will still pursue knowledge... it is a terminal goal for them... that is the sense in which ER is not “about” winning and IR is.
Epistemic rationality is a tool.
ER is defined in terms of goals. The knowledge gained by it may be instrumentally useful, but that is not the central point.
You are pretty much saying that the knowledge can sometimes be instrumentally useful. But that does not show epistemic rationality is about winning.
What I’m saying is that all things being equal, individuals, firms, and governments with high ER will outperform those with lower ER. That strikes me as both important and central to why ER matters.
You seem to be saying that high ER, i.e. having beliefs that correspond to reality, is valuable for its own sake. That Truth matters for its own sake. I agree, but that’s not the only reason it’s valuable.
In your society with Offler the Crocodile God, yes, irrational behavior will be rewarded.
But the society where devotion to Offler is rewarded over engineering prowess will have dilapidated bridges or no bridges at all. Even in the Offler society, medicine based on science will save more lives than medicine based on Offler’s teachings. The doctors might be killed by the high priests of Offler for practicing that way, but it’s still a better way to practice medicine. Those irrational beliefs may be rewarded in the short term, but they will make everyone’s life worse off as a result. (Perhaps in the land of Offler’s high priests, clandestine ER is the wisest approach.)
If the neighboring society of Rational-landia builds better bridges, has better medical practices, and creates better weapons with sophisticated knowledge of projectile physics, it will probably overtake and conquer Offler’s people.
In North Korea today, the best way to survive might be to pledge complete loyalty to the supreme leader. But the total lack of ER in the public sphere has set it back centuries in human progress.
NASA wasn’t just trying to figure out rocket science for its own sake in the 1960s. It was trying to get to the moon.
If the terminal goal is to live the best possible life (“winning”), then pursuing ER will be incredibly beneficial in achieving that aim. But ER does not obligate those who seek it to make it their terminal goal.
What I’m saying is that all things being equal, individuals, firms, and governments with high ER will outperform those with lower ER.
That is probably true, but not equivalent to your original point.
You seem to be saying that high ER, i.e. having beliefs that correspond to reality, is valuable for its own sake.
I am not saying it is objectively valuable for its own sake. I am saying an epistemic rationalist is defined as someone who terminally, i.e. for its own sake, values knowledge, although that is ultimately a subjective evaluation.
If the terminal goal is to live the best possible life (“winning”), then pursuing ER will be incredibly beneficial in achieving that aim. But ER does not obligate those who seek it to make it their terminal goal.
It’s defined that way!!!!!
Forgive me, as I am brand new to LW. Where is it defined that an epistemic rationalist can’t seek epistemic rationality as a means of living a good life (or for some other reason) rather than as a terminal goal? Is there an Académie française of rationalists that takes away your card if you use ER as a means to an end?
I’m working off this quote from EY as my definition of ER. This definition seems silent on the means-end question.
Epistemic rationality: believing, and updating on evidence, so as to systematically improve the correspondence between your map and the territory. The art of obtaining beliefs that correspond to reality as closely as possible. This correspondence is commonly termed “truth” or “accuracy”, and we’re happy to call it that.
This definition is agnostic on motivations for seeking rationality. Epistemic rationality is just seeking truth. You can do this because you want to get rich or get laid or get status or go to the moon or establish a better government or business. People’s motivations for doing what they do are complex. Try as I might, I don’t think I’ll ever fully understand why my primate brain does what it does. And I don’t think anyone’s primate brain is seeking truth for its own sake and for no other reasons.
Also, arguing about definitions is the least useful form of philosophy, so if that’s the direction we’re going, I’m tapping out.
But I will say that if the only people the Académie française of rationalists deems worthy of calling themselves epistemic rationalists are those with pure, untainted motivations of seeking truth for its own sake and for no other reasons, then I suspect that the class of epistemic rationalists is an empty set.
[And yes, I understand that instrumentality is about the actions you choose. But my point is about motivations, not actions.]
Forgive me, as I am brand new to LW. Where is it defined that an epistemic rationalist can’t seek epistemic rationality as a means of living a good life (or for some other reason) rather than as a terminal goal?
From the wiki:
Epistemic rationality is that part of rationality which involves achieving accurate beliefs about the world. [...] It can be seen as a form of instrumental rationality in which knowledge and truth are goals in themselves, whereas in other forms of instrumental rationality, knowledge and truth are only potential aids to achieving goals.
ER vs IR. I am not sure what your question is.
I think of ER as sharpening the axe. I’m not sure how many trees I will cut down or when, but with a sharp axe I will cut them down swiftly and with ease. I think of IR as actually getting down to swinging the axe. Both are needed. ER is a good terminal goal because it enables the other goals to happen more freely. Even if you don’t know the other goals, having a sharper axe helps you be prepared to cut the tree when you find it.
Upvoted, but I want to throw in the caveat that some baseline level of epistemic rationalism is very useful for winning. Schizophrenics tend to have a harder time of things than non-schizophrenics.
That is a limitation of looking at this community specifically, but the general sense of the question can also be approached by looking at communities for specific activities that have strong norms of rationality.
I think most of the time rationality is not helpful for applied goals, because doing something well usually requires domain-specific knowledge that’s acquired through experience, and yet experience alone is almost always sufficient for success. In cases where the advice of rationality and experience conflict, oftentimes experience wins even if it should not, because the surrounding social context is built by and for the irrational majority. If you make the same mistake everyone else makes you are in little danger, but if you make a unique mistake you are in trouble.
Rationality is most useful when you’re trying to find truths that no one else has found before. Unfortunately, this is extremely difficult to do even with ideal reasoning processes. Rationality does offer some marginal advantage in truth seeking, but because useful novel truths are so rare, most often the costs outweigh the benefits. Once a good idea is discovered, oftentimes irrational people are simply able to copy whoever invented the idea, without having to bear all the risk involved with the process of the idea’s creation. And then, when you consider that perfect rationality is beyond mortal reach, the situation begins to look even worse. You need a strategy that lets you make better use of truth than other people can, in addition to the ability to find truth more easily, if you want to have a decent chance to translate skill in rationality into life victories.
What is “rationality” even supposed to be if not codified and generalized experience?
Yes. This is much like what I said in my comment: people from Less Wrong are simply much more interested in truth in itself, and as you say here, there is little reason to expect this to make them more effective in attaining other goals.