There is a hard limit to how far you can go as a rationalist without scientific literacy; and there is similarly a hard limit to scientific literacy without cultivated rationality.
But they do not share a 1:1 correlation, certainly.
There is a hard limit to how far you can go as a rationalist without scientific literacy; and there is similarly a hard limit to scientific literacy without cultivated rationality.
This is vague and sounds false. I don’t know where you’re getting the idea of a hard limit—as Desrtopa noted, plenty of scientists more studied than you or I believe in God. I especially don’t know what you mean by ‘far’.
Vague only because I didn’t state it with mathematical precision. For good reason: it’s not a statement of precise measurement. As to the ‘apparent’ falseness… as I illustrate further below: quite frankly, you ought to know better. If this comes across as condescending, well… read the below.
as Desrtopa noted, plenty of scientists more studied than you or I believe in God.
Certainly. And nothing I said gainsaid that. But I know that those self-same scientists do not also believe in witchcraft that lets you fly around on broomsticks, or that God will smite the unbelievers if they live as Holy Warriors of God. Their scientific literacy has informed them too deeply about how the world works, and as a result has forced their religiosity to conform, to a large extent, to what they do know. While they generally do not place direct instrumental emphasis on explicitly becoming more rational, and fall into many very common traps, there is a large body of philosophy behind the concept we call “scientific literacy”: standards such as the principle of parsimony, the Copernican Principle, and the differentiation between anecdotes and evidence. All of these necessarily affect the degree to which a person is rational or irrational. (There are individual deviations from this pattern, certainly. But to emphasize, for example, the fact that scientists can be religious ignores entirely the effect, on aggregate, that scientific literacy has on religiosity.)
Conversely; if you don’t know the necessary science you will be unable to fully mature to the limits of what humans can achieve in terms of cultivated rationality.
“Sounds false.” I’m disappointed that your comment has been upvoted here. This should have been self-evident on cursory examination: cognitive science, behavioral economics, and the direct instrumental value of greater scientific literacy (in service of the terminal value of being better able to distinguish between true and false; to “be less wrong”) are all fundamentally requisite if one intends to pursue greater rationality deeply in one’s life and person. To “go further as a rationalist”; to “Be Stronger”.
I especially don’t know what you mean by ‘far’.
Please spare us references to inferential gaps when I say—you should have.
terminal value of being better able to distinguish between true and false; to “be less wrong”
I don’t think that’s quite a terminal value.
Regardless, modern science aims to minimize false positives. Strong ability in that is hard to compare to weaker abilities to adeptly manage the ratio of false positive to false negative errors, calculate the expected value of information, account for cognitive biases, and so on.
Two people, each having one of the ability sets I described above, would perform differently in different scenarios, with environment and values as the variables. It may be possible to objectively compute values for each describable skill set. Assuming one can, and that hard limits exist: since no person will actually approach either hard limit, and since even someone who could approach one limit would probably do better to aim for a Pareto optimum far from it, I say this is vague and sounds true but not important.
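The trade-off described above (managing the ratio of false positive to false negative errors, rather than only minimizing false positives) can be made concrete with a small sketch. Everything below is hypothetical: the scores, costs, and thresholds are invented purely for illustration.

```python
# A minimal sketch of the trade-off: moving a decision threshold trades
# false positives against false negatives, and with explicit error costs
# one can pick the threshold that minimizes expected cost.
# All numbers are made up for illustration.

def error_rates(threshold, positives, negatives):
    """Return (false positive rate, false negative rate) at a threshold."""
    fp = sum(1 for x in negatives if x >= threshold) / len(negatives)
    fn = sum(1 for x in positives if x < threshold) / len(positives)
    return fp, fn

# Hypothetical classifier scores for truly-positive and truly-negative cases.
positives = [0.6, 0.7, 0.8, 0.9]
negatives = [0.1, 0.3, 0.5, 0.7]

# Suppose a missed positive costs five times as much as a false alarm.
COST_FP, COST_FN = 1.0, 5.0

def expected_cost(threshold):
    fp, fn = error_rates(threshold, positives, negatives)
    return COST_FP * fp + COST_FN * fn

# Sweep candidate thresholds and keep the cheapest one.
best = min((t / 10 for t in range(11)), key=expected_cost)
print(best, expected_cost(best))
```

Changing the cost ratio shifts the optimal threshold: someone who fears false negatives more will accept more false positives, which is why the two skill sets in the comment above can't be ranked without specifying environment and values.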
Then you and I do not share terminal values. I value being “right” because it is “right”. I disvalue being “wrong” because it is wrong. These are practically tautological. :)
I say this is vague and sounds true but not important.
Well… there is significance in judging the ability of an arbitrary individual from a given culture to achieve excellence in either quality by examining the availability of the other, especially when attempting to compare the ‘greatness’ of their achievements. But that has less to do with the hardness of the limits and more to do with the strength of correlation. (With the caveat that the correlations are only statistical; individuals can and do violate those correlations quite frequently—a testament to how skilled human beings are at being inherently contradictory.)
The body of knowledge that today comprises cognitive science and behavioral economics is something our predecessors of a century ago did not have. As a result, should the information be widely disseminated (with fidelity) over time, I should expect to see something equivalent to the Flynn Effect in terms of what Eliezer calls the “sanity waterline”. (With people like Cialdini and Ariely newly entering the ‘grand marketplace of ideas’, we might see a better result toward that goal than the folks at Snopes have achieved with their individual, piecemeal approach.)
Correspondence of beliefs to reality being desirable is no closer to being a tautology than financial institutes being on the side of rivers, undercover spies digging tunnels in the ground, or spectacles being drinking vessels.
Correspondence of beliefs to reality being desirable is no closer to being a tautology
If I had said something that meant something loosely correlated to this, your point would be valid. Instead, what I said was: “I value being right because it is right; I disvalue being wrong because it is wrong.”
Your first paragraph here seems completely unnecessary. Your next paragraph walks back the claim I thought you were making about “cultivated” rationality. If you just used the wrong word before, that seems understandable, but you shouldn’t get snippy when people can’t read your mind.
The paragraph beginning “Sounds false” seems so ungrammatical that I can’t tell what it means other than ‘Science good.’
Your first paragraph here seems completely unnecessary.
I disagree. Strongly. LW-ers in general need to recall that a linguistically phrased statement is not “more wrong” by nature than a mathematically phrased one—especially when the topic itself does not lend itself to such a thing. (Though not a direct correlation, this maps well to the notion of Fake Utility Functions.)
Made up numbers are worse than ‘vague’—they are counter-productive: they “prime” the reader (and the writer) towards specific subsets of the available space.
If you just used the wrong word before, that seems understandable,
It wasn’t the wrong word at all. It was exactly the right word. All I did was flesh it out further. That is why I re-used the exact same term: “you will be unable to fully mature to the limits of what humans can achieve in terms of cultivated rationality.”
but you shouldn’t get snippy when people can’t read your mind.
… Is it really so hard to not follow such a simple request? “Please spare us references to inferential gaps”
This wasn’t a question of mind reading but rather of not being profoundly ignorant as to the topic upon which truth/false judgments were being exercised. I explained this in depth and expressed my disappointment with this current state of affairs.
The paragraph beginning “Sounds false” seems so ungrammatical
… It’s called the conversational tone. You use it, in writing, when you are holding a conversation.
that I can’t tell what it means other than ‘Science good.’
I have no reaction to this other than contempt. If I had meant to say “science good” I would have said it. I was quoting the guy I was responding to. Just why, pray tell, is this such a difficult concept to grasp? (I will point out that my emotional reactions here indicate to me that you had no serious intention of conveying anything with this statement other than to take a petty swing at someone whose tone you disliked because it was mean, and “mean is bad”. This site would do a great deal better if people would grow out of White Knighting.)
Scientific literacy and rationality are not nearly as highly correlated as one would hope.
(I have “being less wrong” as somewhat of a terminal value, as far as I can tell.)