Imagine two worlds: one where everyone spends a huge chunk of their time trying to find and lay claim to the next big heavy-tailed thing, and one where a small number of people do that while the rest each randomly assign themselves to some narrow craft to perfect. It seems to me that the second world results in the big heavy-tailed things being done better in the end, even though many individuals miss out by being captured in the wrong niche.
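The tradeoff here can be made concrete with a toy Monte Carlo sketch. Everything below is an invented illustration, not a claim about real numbers: the population size, the Pareto tail index, and the skill multipliers are all assumptions, and the comparison flips depending on how you set them, which is arguably part of the point.

```python
import random

# Toy model of the two worlds. All parameters are made-up assumptions
# chosen only to make the structure of the argument visible.
random.seed(0)

N = 10_000          # people per world (assumption)
SEARCHERS_B = 500   # world B: only a few keep hunting for the next big thing

def heavy_tailed_payoff():
    """One Pareto-distributed draw: most attempts small, a few enormous."""
    return random.paretovariate(1.5)

def world_a_output():
    # Everyone searches; splitting attention between searching and executing
    # means each person executes at a low skill multiplier (assumption: 0.2).
    return sum(0.2 * heavy_tailed_payoff() for _ in range(N))

def world_b_output():
    # A few search at full skill; everyone else perfects a narrow craft,
    # modeled as a steady, non-heavy-tailed contribution (assumption: 1.0),
    # which stands in for "the big things get done better in the end".
    search = sum(heavy_tailed_payoff() for _ in range(SEARCHERS_B))
    craft = (N - SEARCHERS_B) * 1.0
    return search + craft

TRIALS = 100
a = sum(world_a_output() for _ in range(TRIALS)) / TRIALS
b = sum(world_b_output() for _ in range(TRIALS)) / TRIALS
print(f"world A (everyone searches): {a:,.0f}")
print(f"world B (few search):        {b:,.0f}")
```

Under these particular multipliers world B wins on average, but nudging the execution penalty or the tail index reverses it, so the simulation mostly shows what you would need to agree on before the comparison settles anything.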
I agree with this, and it seems to be at the root of one of my biggest procedural disagreements with EA and rationality. I’d love to hear some really gears-based arguments against it.
Which world do you think would be better? I’m assuming the latter, but correct me if I’m wrong.
I haven’t thought this through, but here is a potential argument against: 1) agreement on what the heavy-tailed problems are and how to weight them is a necessary condition for the latter to be the better strategy; 2) neither this community nor broader society has that agreement; thus 3) we should still focus on correctness overall.
That does reflect my own thinking about these things.
I can definitely see how this idea could ground a procedural disagreement with EA, but how does the rationality version work? Can you say more about that?