I did not actively look for contradictory evidence.
I hate to discourage you when you’re otherwise doing quite well, but the above is a major, major error.
Because of the human tendency towards confirmation bias, it's vitally important that you try to get a sense of the totality of the evidence, with a heavy emphasis on the evidence that contradicts your beliefs. If you have to prioritize, look for the contradicting stuff first.
I suppose if I thought anyone would do anything with this idea—like if someone said “OK, great idea, we’re going to appoint you as an advisor to the new enhancement panel”, I’d start getting very cautious and go make damn sure I wasn’t wrong.
But as the situation is … I am not particularly incentivized to do this; and others at LW will probably be better at finding evidence against this than I am.
You should be doing that anyway.
Interesting. Does it bother you that you are not strongly motivated to avoid error?
There is a legitimate question of which errors are worth the time to avoid. Roko made a perfectly sensible statement: that it's not his top priority right now to develop immense certitude about this proposition, but that it would become a higher priority if the answer became more important. It is entirely possible to spend all of one's time attempting to avoid error (less the time necessary to eat and so on, so as to remain alive and eradicate more error in the long run); I notice that you choose to spend a fair amount of your time making smart remarks to others here instead of doing that. Does it bother you that you are at certain times motivated to do things other than avoid some possible instances of error?
Positive errors can be avoided by the simple expedient of not committing them. That usually carries very little cost.
I agree completely, but this doesn’t seem to be Roko’s situation: he’s simply not performing the positive action of seeking out certain evidence.
But that action is a necessary part of producing a conclusion.
Holding a belief, without first going through the stages of searching for relevant data, is a positive error—one that can be avoided by the simple expedient of not reaching a conclusion before an evaluation process is complete. That costs nothing.
Asserting a conclusion is costly, in more than one way.
Humans hold beliefs about all sorts of things based on little or no thought at all. It can’t really be avoided. It might be an open question whether one should do something about unjustified beliefs one notices one holds. And I don’t think there’s anything inherently wrong with asserting an unjustified belief.
Of course, even my use of ‘unjustified’ above is tentative; it would be better to say “insufficiently justified for the context”, in which case the problem goes away. Certainly, seeing what looks like a flower is sufficient justification for the belief that there is a flower, if nothing turns on it.
Not sure which sort of case Roko’s is, though.
At each point, you may reach a conclusion held with some uncertainty, and you expect that level of certainty to change as you learn more. It would be an error to jump immediately to an unwarranted level of certainty, but not to pronounce an uncertain conclusion.
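To make that concrete, here is a minimal sketch of that updating process in Python, using Bayes' rule in odds form. The starting confidence and the likelihood ratios are entirely made-up numbers chosen for illustration; nothing in the thread above pins them down. The sketch also bears on the earlier point about prioritizing contradicting evidence: one strong disconfirmation moves the belief further than two mild confirmations combined.

```python
# A minimal sketch (with made-up numbers) of holding a conclusion with
# explicit uncertainty and revising it as evidence arrives: Bayes' rule
# in odds form.

def update(prior: float, likelihood_ratio: float) -> float:
    """Posterior probability after one piece of evidence.

    likelihood_ratio = P(evidence | hypothesis) / P(evidence | not-hypothesis).
    Ratios above 1 confirm the hypothesis; ratios below 1 contradict it.
    """
    prior_odds = prior / (1.0 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1.0 + posterior_odds)

# A tentative conclusion: 70% confident to start.
belief = 0.70

# Hypothetical evidence stream: two mild confirmations, then one strong
# piece of contradicting evidence.
for ratio in [2.0, 1.5, 0.1]:
    belief = update(belief, ratio)
    print(f"after evidence with likelihood ratio {ratio}: P = {belief:.2f}")

# The single contradicting item moves the belief further than both
# confirmations combined, which is why it's the thing to go looking for.
```

Run as written, the belief climbs from 0.70 to about 0.88 after the two confirmations, then drops to about 0.41 after the single piece of contradicting evidence.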