The main problem with this scenario is that a real rationality organization wouldn’t create a test you can game by training and studying hard for it. You don’t want people trying to guess the teacher’s password.
The whole point of creating a good test is to close off ways of scoring highly by studying for the test instead of engaging with real-life issues. That means you could use something like an internship model and decide, based on performance in the internship, whether or not to make someone a card-carrying rationalist.
Take the Alignment Forum as an example of a rationalist organization with selected membership. You can get in by writing good AI risk posts on LessWrong, by having formal credentials (i.e. another institution vetted you), and likely a few other ways.
There’s no test you can study and train hard for to get into the Alignment Forum.
Besides the Alignment Forum, CFAR’s candidate filter or the one used in the LWCW might be other existing examples of selecting people as rationalists.