One simple idea is to make a list of people who seem individually rational to you, and ask them what their areas of expertise are. Then, if you have a question related to one of those areas, ask them. (An equivalent of “use google” or “ask at StackExchange”, but perhaps better for questions where there is a lot of misinformation out there, or where your question would be dismissed as “too open” on SE, or where you want to find out about your unknown unknowns, etc.) If people start doing this regularly, then having an expert in the group will automatically increase the whole group’s expertise. Most people don’t mind talking about their hobbies; but with rationalists you may get the extra advantage of them telling you “actually, I don’t know” when they happen to not know.
For instrumental rationality, find a group of people who actually want to improve at instrumental rationality (as opposed to people who merely visit LW to kill time), and create a private discussion. It’s better if you can also see each other in real life, for example at meetups.
Robin Hanson would probably recommend having an internal prediction market and using it frequently. But that can create perverse incentives if you bet on stuff you can influence. (Maybe there is a way to fix this, but that needs to be considered specifically. You want a situation where people can benefit from helping a project, but not from sabotaging it. Like, when you believe the project will fail, the optimal strategy would be to abstain from betting on it, not to bet against it. But the people who bet on the project would still lose their points if the project fails. It’s just that no one can gain points from a failing project. A sketch of this payoff rule is below.) It would probably be more useful when the group gets larger, so that people can bet on things they personally don’t influence.
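Here is a minimal sketch of that asymmetric payoff rule, assuming a simple points-based internal market. All the names here (`ProjectMarket`, `back`, `settle`) are hypothetical illustrations, not a real prediction-market API:

```python
# A minimal sketch of the asymmetric payoff rule described above:
# participants may stake points FOR a project or abstain, but cannot
# bet against it, so no one can profit from sabotage.

from dataclasses import dataclass, field


@dataclass
class ProjectMarket:
    """On success, backers gain their stake; on failure, they lose it.

    Since there is no "against" position, the worst incentive for a
    skeptic is simply to abstain.
    """
    stakes: dict[str, int] = field(default_factory=dict)

    def back(self, person: str, points: int) -> None:
        # Only positive "for" stakes are allowed; skeptics abstain.
        if points <= 0:
            raise ValueError("stake must be positive; to bet against, abstain")
        self.stakes[person] = self.stakes.get(person, 0) + points

    def settle(self, succeeded: bool) -> dict[str, int]:
        """Return each backer's point change when the project resolves."""
        sign = 1 if succeeded else -1
        return {person: sign * stake for person, stake in self.stakes.items()}


market = ProjectMarket()
market.back("alice", 10)   # Alice believes the project will succeed.
# Bob expects failure, so his optimal move is to stake nothing at all.
print(market.settle(succeeded=False))  # {'alice': -10}: backers lose, no one gains
```

The design choice is that the payoff is one-sided: helping a project and backing it are aligned, while someone expecting failure gains nothing by making it fail.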