Though I lean toward agreeing with the conclusion that increased IQ would mitigate existential risk, I’ve been somewhat skeptical of the assertions you’ve previously made to that effect. This post provides some pretty reasonable support for your position.
The statement “Can I find some empirical data showing a correlation between IQ and quality of government” does make me curious about your search strategy, though. Did you specifically look for contrary evidence? Are there any other correlations with IQ (besides the old “more scientists to kill us” argument) that might directly or indirectly contribute to risk, rather than reduce it?
Kudos and karma to anyone who can dig up evidence unambiguously contradicting Roko’s hypothesis.
My search strategy was to put “IQ”, “religion”, etc. into Google Scholar and Google. I found no papers suggesting that IQ correlates with increased religiosity. I found the reference to good governance by chance; it was a pleasant surprise.
I did not actively look for contradictory evidence.
I hate to discourage you when you’re otherwise doing quite well, but the above is a major, major error.
Due to the human tendency toward confirmation bias, it is vitally important that you try to get a sense of the totality of the evidence, with a heavy emphasis on the evidence that contradicts your beliefs. If you have to prioritize, look for the contradicting evidence first.
I suppose if I thought anyone would do anything with this idea—like if someone said “OK, great idea, we’re going to appoint you as an advisor to the new enhancement panel”, I’d start getting very cautious and go make damn sure I wasn’t wrong.
But as the situation is … I am not particularly incentivized to do this; and others at LW will probably be better at finding evidence against this than I am.
You should be doing that anyway.
Interesting. Does it bother you that you are not strongly motivated to avoid error?
There is a legitimate question of which errors are worth the time to avoid. Roko made a perfectly sensible statement: that it is not his top priority right now to develop immense certitude about this proposition, but that it would become a higher priority if the answer became more important. It is entirely possible to spend all of one's time attempting to avoid error (less the time necessary to eat and otherwise stay alive, so as to eradicate more error in the long run); I notice that you choose to spend a fair amount of your time making smart remarks to others here instead of doing that. Does it bother you that you are at certain times motivated to do things other than avoid some possible instances of error?
Positive errors can be avoided by the simple expedient of not committing them. That usually carries very little cost.
I agree completely, but this doesn’t seem to be Roko’s situation: he’s simply not performing the positive action of seeking out certain evidence.
But that action is a necessary part of producing a conclusion.
Holding a belief, without first going through the stages of searching for relevant data, is a positive error—one that can be avoided by the simple expedient of not reaching a conclusion before an evaluation process is complete. That costs nothing.
Asserting a conclusion is costly, in more than one way.
Humans hold beliefs about all sorts of things based on little or no thought at all. It can’t really be avoided. It might be an open question whether one should do something about unjustified beliefs one notices one holds. And I don’t think there’s anything inherently wrong with asserting an unjustified belief.
Of course, I'm using ‘unjustified’ above only tentatively; it would be better to say “insufficiently justified for the context,” in which case the problem goes away. Certainly, seeing what looks like a flower is sufficient justification for the belief that there is a flower, if nothing turns on it.
Not sure which sort of case Roko’s is, though.
At each point, you may reach a conclusion with some uncertainty. You expect both the conclusion and your certainty in it to change as you learn more. It would be an error to jump immediately to an unwarranted level of certainty, but not to pronounce an uncertain conclusion.
There's also the possibility of causality in the other direction: good governance can raise the IQ of a population through any number of mechanisms (better nutrition, better health care, better education, etc.).
Again, finding a correlation between IQ and quality of government constitutes only weak evidence for the claim that increased IQ causes better government. Note that the authors of the paper made this claim too.