Thanks for the feedback.
The division between ‘us’ and ‘the common man’ is along the lines of raw intelligence and education in the broad sense (not the narrow academic sense). I am not comfortable with it myself, but there seems to be evidence that there is some barrier to entry, especially if you want to contribute to advancing the state of the art. (I cannot locate the article where Eliezer set out the requirements for someone helping to program the Friendly AI.)
Also, I am not speaking of rationality as a set of techniques, but as a worldview that is informed by atheism and the findings of science at the very least. I should perhaps make this more clear in the article or choose another term.
Is this what you were looking for? If so, I don’t think it should be included in your post. Eliezer was talking about being a Seed AI programmer, not a rational thinker. You certainly don’t have to be a supergenius to try to improve your own rationality.
Not to mention, that piece is years (not sure how many) out of date.
I don’t think Eliezer’s requirements have been revised to anything significantly more inclusive (relative to the general population). I’m not making a negative judgement on this; he may well be right not to revise them. But I’m fairly confident this is the case.
Thanks. As I said, this is a barrier to contributing on the cutting edge. More appropriate, however, is this article by Louie, which cites an IQ of 130+ and an “NT” (Rational) MBTI type as prerequisites for understanding the sequences.
Except that, really, there is no evidence presented there that you need either of these prerequisites to understand the sequences. They’re just criteria that Louie arbitrarily decided were important.
I think he’d argue that the LW reader surveys, which show a concentration at the extreme upper end of the intelligence spectrum, justify his claim.
Now, I think that the people willing and able to comprehend the sequences are fewer than the people willing and able to comprehend rationality, but the question is how much return on investment there is in working to reach more and more of the people in the second group who are not in the first.
Then perhaps the word you want is “skepticism” rather than “rationalism”. Rationalists, as I understand it, do not define themselves by a worldview (and definitely not a worldview ‘informed’ by doctrines and findings). An atheism embraced so as to become a member of a community is as anathema to a rationalist as would be a theism embraced for the same reason.
Alas, skepticism fits even less, as it is merely an outlook. In this community, however, atheism is treated as an open-and-shut issue, and I suspect most would say they expect a rational person, after considering the evidence on both sides, to come down on the side of atheism. After all, the latest survey showed that the LWers willing to fill in a survey were 80% atheist. Perhaps I should clarify that I mean weak atheism (no belief that a god exists), not strong atheism (belief that no god exists).
Regardless, nothing should be ‘embraced so as to become a member of a community’, including vanilla rationality (the scientific method? Bayes?). That is a fundamental conflict of interest that all communities face and are in many cases destroyed by. This is exactly the reason why things like the ‘existential risk career network’ scare me quite a bit, especially if they become known as ways to get a lucrative job.
Not if you’re going to endorse Bayes in the next sentence you shouldn’t :-)
I’m not sure I understand this. Could you clarify? Are you saying that a true Bayesian doesn’t think there is a distinction? That a wise Bayesian will be neither kind of atheist?
So Bayesian epistemology doesn’t actually make use of the word ‘belief’; instead, we just assign probabilities to hypotheses. You don’t believe or not believe; you just estimate p. So the distinction isn’t really intelligible. I guess one could interpret a weak atheist as assigning a higher probability to God’s existence than a strong atheist does… but it doesn’t obviously translate that way and isn’t something a Bayesian would say.
Got it. Thx.
I suppose someone could claim that a strong atheist actually sets P(God) = 0, whereas a weak atheist sets P(God) to some small epsilon. But then a Bayesian shouldn’t become a strong atheist.
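(A minimal sketch of why, assuming we write ‘God exists’ as a hypothesis H and take the strong atheist to literally hold the prior P(H) = 0: Bayes’ theorem can never raise a prior of exactly zero, whatever evidence E turns up.)

\[ P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E \mid H)\,P(H) + P(E \mid \neg H)\,P(\neg H)} \]

With P(H) = 0 the numerator vanishes for every possible E, so P(H | E) = 0 no matter what is observed; a prior of some small epsilon > 0, by contrast, can still be pushed up by sufficiently strong evidence. Setting a probability to exactly 0 (or 1) amounts to refusing to update.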
See here.
I’m not sure I understand this. Could you clarify?
I’m not looking to start an argument here. I don’t need to hear reasons. I just want to know what Jack meant when he responded to “Perhaps I should clarify …” with “Not if you are going to endorse Bayes.”
As to whether “skepticism” names a worldview, an outlook, or some pieces of a methodology—apparently there is some current controversy on that.