Of course this is true, as far as it goes.
But I’m inferring something from it in context that you perhaps don’t mean, and I’d like to clarify. (Assuming you even read comments from this far back.)
Specific example: a couple of months after you posted this, I suffered a brain aneurysm that significantly impaired my working memory, to the point where even elementary logic problems—the sort that currently would barely register as problems that needed solving in the first place—required me to laboriously work them out with paper and pen. (Well, marker… my fine motor control was shot, also.)
The question arises: could I have passed this initiation ceremony?
I certainly could not have given the right answer. It would have been a challenge to repeat the problem, let alone solve it, in a verbal examination. My reply would in fact have been “I’m not sure. May I have a pen and paper?”
If the guide replies more or less as you do here, then I fail.
I draw attention to two possibilities in that scenario:
(A) This is a legitimate test of rationality, and I failed it. I simply was less rational while brain-damaged, even though it didn’t seem that way to me. That sort of thing does happen, after all.
(B) This test is confounding rationality with the ability to do mental arithmetic reliably. I was no less rational then than I am now.
If A is true, then you and the guide would be correct to exclude me from the club, and all is as it should be.
But if B is true, doing so would be an error. Not because it wasn’t fair, or wasn’t my fault, or anything like that, but because you’d be trying to build a group of rationalists while in fact excluding rationalists based on an irrelevant criterion.
Now, perhaps the error is negligible. Maybe the Bayesian Conspiracy will collect enough of the most rational minds of each generation that it’s not worth giving up the benefits of in-group formation to attract the remainder.
OTOH, maybe not… in which case the Bayesian Conspiracy is on the wrong track.
They aren’t trying to build a group of rationalists. They are trying to build a group of people who can achieve certain goals.
(nods) Fair enough. Not knowing the goals, I’m in no position to judge this fictional selection procedure… I’d have to read more stories set in this world to be entitled to an opinion there.
Trivially, if what they want is a group that is good at mental arithmetic and resisting social pressure, they’re going about it in a reasonable way.
More broadly, if they aren’t claiming that their initiation procedure preferentially selects rationalists, then my concern becomes irrelevant.
Nobody else seems to have added this response, so I will. We don’t know that this moment, in the ritual room, is the only test they undergo. Perhaps the ability to take a written exam is assessed as part of the public procedures. Perhaps the first stage is a great open exam, running nearly continuously, that anyone who wants to can sit, and Brennan has spent months in a cloisterlike environment in the public-secret face of the conspiracy, where those who can study the sciences but not generate new science do their studying.
I assume that there are other tests involved, both before and after, but I don’t see the relevance of that. I may be missing your point.
Perhaps I missed yours? Rationality requires the ability to resist social pressure, certainly. Are you questioning whether this procedure distinguishes rationalists from nonrationalists? If so, then taken on its own I don’t argue that it would; just that it would probably be one member of a larger set of tests.
Thinking about it more now… yes, I was implicitly assuming that failing any of the tests barred further progress, and you’re right that this wasn’t actually said. I stand corrected; thanks for pointing that out.
Do we know that saying “I don’t know” is a failure? Clearly accepting the one-sixth answer given by the guide would be a failure, and stubbornly sticking to a different wrong answer is probably a failure as well, but saying “I need more time and equipment to figure this out” might very well be tolerated.
Well, right, that’s essentially the question I was asking the author of the piece.
This comment sure does seem to suggest that requesting more time and equipment would not be tolerated, and would count as a failure… but no, I don’t know one way or the other, which is why I asked.