My prior intuitive guess was that H1 seemed a decent chunk more likely than H2 or H3.
Actually I changed my mind.
Why I thought this before: H1 seems like a potential runaway process and is clearly about individual selection, which has stronger effects than group selection (and it was mentioned in HPMoR).
Why I don’t think this anymore:
It would also be an incredibly huge coincidence if intelligence mostly evolved because of social dynamics but happened to be useful for all sorts of other survival techniques that hunter-gatherers use. See e.g. Scott Alexander’s book review of “The Secret of Our Success”.
If intelligence only had individual benefits and was not very useful otherwise, then over long timescales group selection[1] would actually select against smarter humans, because their neurons use up more metabolic energy.
However, there is a possibly very big piece of evidence for H3: humans are both the smartest land animals and have the best interface for using tools, which would seem like a suspicious coincidence.
I think this is not a coincidence, but rather that tool use let humans fall into an attractor basin where the payoffs of intelligence were more significant.
I mean group selection that could potentially operate at the level of species, where whole species go extinct. Please let me know if that is actually called something different.