(This is in the same general area as casebash’s two suggestions, but I think it’s different enough to be worth calling out separately.)
Most of the material on LW is about individual rationality: How can I think more clearly, approximate the truth better, achieve my goals? But an awful lot of what happens in the world is done not by individuals but by groups. Sometimes a single person is solely responsible for the group’s aims and decision-making, in which case their individual rationality is what matters, but often not. How can we get better at group rationality?
(Some aspects of this will likely be explored for commercial gain more thoroughly than individual rationality has been, since many businesses have ample resources and strong motivation to spend them if the ROI is good; I bet there are any number of groups out there offering training in brainstorming and project planning, for instance. But I bet there’s plenty of underexplored group-rationality memespace.)
One simple idea is to make a list of people who seem individually rational to you, and ask them what their areas of expertise are. Then, if you have a question related to one of those areas, ask them. (An equivalent of “use Google” or “ask on Stack Exchange”, but perhaps better for questions where there is a lot of misinformation out there, where your question would be dismissed as “too open” on SE, or where you want to find out about your unknown unknowns, etc.) If people start doing this regularly, then having one expert in the group automatically increases the whole group’s effective expertise. Most people don’t mind talking about their hobbies; but with rationalists you may get the extra advantage that they will tell you “actually, I don’t know” when they happen not to know.
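Just to make the bookkeeping concrete, here is a minimal sketch of such an expert list in Python; the names and the `who_to_ask` helper are purely illustrative, not an existing tool.

```python
# Sketch of the "expert list" idea: map each trusted person to their
# stated areas of expertise, then look up whom to ask about a topic.

experts = {
    "Alice": {"statistics", "nutrition"},
    "Bob": {"programming", "tax law"},
}

def who_to_ask(topic):
    """Return the people who listed this topic as an area of expertise."""
    return [name for name, areas in experts.items() if topic in areas]

print(who_to_ask("nutrition"))        # ['Alice']
print(who_to_ask("quantum physics"))  # [] -> fall back to Google / Stack Exchange
```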
For instrumental rationality, find a group of people who actually want to improve at instrumental rationality (as opposed to people who merely visit LW to kill time), and create a private discussion. It’s better if you can also see each other in real life, for example at meetups.
Robin Hanson would probably recommend having an internal prediction market and using it frequently. But that can create perverse incentives if you bet on things you can influence. (Maybe there is a way to fix this, but it needs to be designed carefully. You want a situation where people can benefit from helping a project, but not from sabotaging it. For example, if you believe the project will fail, the optimal strategy should be to abstain from betting on it, not to bet against it. The people who bet on the project would still lose their points if it fails; it’s just that no one can gain points from a failing project.) This would probably be more useful as the group gets larger, so that people can bet on things they don’t personally influence.
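To illustrate the kind of rule described in the parenthesis, here is a minimal Python sketch of a one-sided market where members can only stake points on a project succeeding; the class and its fixed payout multiplier are hypothetical simplifications, since a real market would derive odds from the pool of stakes.

```python
# One-sided internal prediction market: you can only bet FOR success,
# so nobody can profit from a project failing.

class OneSidedMarket:
    def __init__(self, payout_multiplier=2.0):
        # Placeholder multiplier; real odds would come from aggregated stakes.
        self.payout_multiplier = payout_multiplier
        self.stakes = {}  # member name -> points staked on success

    def bet_on_success(self, member, points):
        """Stake points on the project succeeding. There is deliberately
        no bet_on_failure(): if you expect failure, you simply abstain."""
        if points <= 0:
            raise ValueError("stake must be positive")
        self.stakes[member] = self.stakes.get(member, 0) + points

    def resolve(self, succeeded):
        """Return each member's point change once the outcome is known."""
        if succeeded:
            # Backers gain in proportion to their stake.
            return {m: s * (self.payout_multiplier - 1) for m, s in self.stakes.items()}
        # Backers lose their stake; nobody gains anything from the failure.
        return {m: -s for m, s in self.stakes.items()}


market = OneSidedMarket()
market.bet_on_success("alice", 10)        # Alice expects success
# Bob expects failure, so he abstains rather than betting against it.
print(market.resolve(succeeded=False))    # {'alice': -10}
```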