Epistemic rationality isn’t about winning?

Demonstrated, context-appropriate epistemic rationality is incredibly valuable and should lead to higher status and—to the extent that I understand Less Wrong jargon—“winning.”
Valuable to whom? Value and status aren’t universal constants.
You are pretty much saying that the knowledge can sometimes be instrumentally useful. But that does not show that epistemic rationality is about winning.
The standard way to show that instrumental and epistemic rationality are not the same is to put forward a society where almost everyone holds to some delusory belief, such as a belief in Offler the Crocodile God, and awards status in return for devotion. In that circumstance, the instrumental rationalist will profess the false belief, and the epistemic rationalist will stick to the truth.
In a society that rewards the pursuit of knowledge for its own sake (which ours does sometimes), the epistemic rationalist will get rewards, but won’t be pursuing knowledge in order to get rewards. If they stop getting the rewards, they will still pursue knowledge... it is a terminal goal for them... that is the sense in which ER is not “about” winning and IR is.
Epistemic rationality is a tool.
ER is defined in terms of goals. The knowledge gained by it may be instrumentally useful, but that is not the central point.
You are pretty much saying that the knowledge can sometimes be instrumentally useful. But that does not show that epistemic rationality is about winning.
What I’m saying is that all things being equal, individuals, firms, and governments with high ER will outperform those with lower ER. That strikes me as both important and central to why ER matters.
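Here’s a toy simulation of that claim (the betting setup, numbers, and names are entirely my own illustration, not anything established upthread): an agent whose probability estimates track reality makes better decisions, on average, than one whose estimates are biased.

```python
import random

random.seed(0)

def average_payoff(map_error, n_decisions=100_000):
    """Average payoff for an agent taking even-odds bets it believes are favorable.

    Each round, an event occurs with true probability p (the territory).
    A bet pays +1 if the event occurs and -1 otherwise, so it is worth
    taking exactly when p > 0.5. The agent perceives p with a fixed bias
    (its map), and acts on the map.
    """
    total = 0
    for _ in range(n_decisions):
        p = random.random()            # the territory
        believed_p = p + map_error     # the agent's map
        if believed_p > 0.5:           # decision made using the map
            total += 1 if random.random() < p else -1
    return total / n_decisions

print("accurate map:", average_payoff(map_error=0.0))   # ~ +0.25 per round
print("biased map:  ", average_payoff(map_error=0.3))   # ~ +0.16 per round
```

The biased agent still wins plenty of individual bets, but the gap compounds over many decisions, which is the sense in which higher ER outperforms.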
You seem to be saying that high ER, i.e. having beliefs that correspond to reality, is valuable for its own sake. That Truth matters for its own sake. I agree, but that’s not the only reason it’s valuable.
In your society with Offler the Crocodile God, yes, irrational behavior will be rewarded.
But the society where devotion to Offler is rewarded over engineering prowess will have dilapidated bridges or no bridges at all. Even in the Offler society, medicine based on science will save more lives than medicine based on Offler’s teachings. The doctors might be killed by the high priests of Offler for practicing that way, but it’s still a better way to practice medicine. Those irrational beliefs may be rewarded in the short term, but they will make everyone worse off as a result. (Perhaps in the land of Offler’s high priests, clandestine ER is the wisest approach.)
If the neighboring society of Rational-landia builds better bridges, has better medical practices, and creates better weapons with sophisticated knowledge of projectile physics, it will probably overtake and conquer Offler’s people.
In North Korea today, the best way to survive might be to pledge complete loyalty to the Supreme Leader. But the total lack of ER in the public sphere has set it back centuries in human progress.
NASA wasn’t just trying to figure out rocket science for its own sake in the 1960s. It was trying to get to the moon.
If the terminal goal is to live the best possible life (“winning”), then pursuing ER will be incredibly beneficial in achieving that aim. But ER does not obligate those who seek it to make it their terminal goal.
What I’m saying is that all things being equal, individuals, firms, and governments with high ER will outperform those with lower ER.
That is probably true, but not equivalent to your original point.
You seem to be saying that high ER, i.e. having beliefs that correspond to reality, is valuable for its own sake.
I am not saying it is objectively valuable for its own sake. I am saying an epistemic rationalist is defined as someone who terminally, i.e. for its own sake, values knowledge, although that is ultimately a subjective evaluation.
If the terminal goal is to live the best possible life (“winning”), then pursuing ER will be incredibly beneficial in achieving that aim. But ER does not obligate those who seek it to make it their terminal goal.
It’s defined that way!
Forgive me, as I am brand new to LW. Where is it defined that an epistemic rationalist can’t seek epistemic rationality as a means of living a good life (or for some other reason) rather than as a terminal goal? Is there an Académie française of rationalists that takes away your card if you use ER as a means to an end?
I’m working off this quote from EY as my definition of ER. This definition seems silent on the means-end question.
Epistemic rationality: believing, and updating on evidence, so as to systematically improve the correspondence between your map and the territory. The art of obtaining beliefs that correspond to reality as closely as possible. This correspondence is commonly termed “truth” or “accuracy”, and we’re happy to call it that.
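For concreteness, here is a minimal sketch of what “updating on evidence” means in the Bayesian sense. (The trick-coin example, the numbers, and the code are my own illustration; they are not part of the quoted definition.)

```python
import random

random.seed(1)

# Territory: a trick coin that actually lands heads 3/4 of the time.
# Map: two hypotheses, starting with equal credence in each.
TRUE_HEADS = 0.75
hypotheses = {"fair coin": 0.5, "trick coin": 0.75}   # P(heads | hypothesis)
credence = {"fair coin": 0.5, "trick coin": 0.5}      # prior over hypotheses

for _ in range(50):
    heads = random.random() < TRUE_HEADS   # one flip of evidence from the territory
    # Bayes' rule: posterior is proportional to prior times likelihood.
    for h, p_heads in hypotheses.items():
        credence[h] *= p_heads if heads else 1 - p_heads
    total = sum(credence.values())
    credence = {h: c / total for h, c in credence.items()}

# After 50 flips, the map almost certainly favors "trick coin".
print(credence)
```

The map starts agnostic and, flip by flip, moves toward the hypothesis that matches the territory; that is all the definition asks for.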
This definition is agnostic on motivations for seeking rationality. Epistemic rationality is just seeking truth. You can do this because you want to get rich or get laid or get status or go to the moon or establish a better government or business. People’s motivations for doing what they do are complex. Try as I might, I don’t think I’ll ever fully understand why my primate brain does what it does. And I don’t think anyone’s primate brain is seeking truth for its own sake and for no other reasons.
Also, arguing about definitions is the least useful form of philosophy, so if that’s the direction we’re going, I’m tapping out.
But I will say that if the only people the Académie française of rationalists deems worthy of calling themselves epistemic rationalists are those with pure, untainted motivations of seeking truth for its own sake and for no other reasons, then I suspect that the class of epistemic rationalists is an empty set.
[And yes, I understand that instrumentality is about the actions you choose. But my point is about motivations, not actions.]
Forgive me, as I am brand new to LW. Where is it defined that an epistemic rationalist can’t seek epistemic rationality as a means of living a good life (or for some other reason) rather than as a terminal goal?
From the wiki:
Epistemic rationality is that part of rationality which involves achieving accurate beliefs about the world. … It can be seen as a form of instrumental rationality in which knowledge and truth are goals in themselves, whereas in other forms of instrumental rationality, knowledge and truth are only potential aids to achieving goals.
I think of ER as sharpening the axe: not sure how many trees I will cut down or when, but with a sharp axe I will cut them down swiftly and with ease. I think of IR as actually getting down to swinging the axe. Both are needed. ER is a good terminal goal because it enables the other goals to happen more freely. Even if you don’t know the other goals, having a sharper axe helps you be prepared to cut the tree when you find it.