I… didn’t mention Ender’s Game or military-setups-for-children. I’m sorry for not making that clearer and will fix it in the main post. Also, I am trying to do something instead of solely complaining (I’ve written more object-level posts and applied for technical-research grants for alignment).
There’s also the other part: that, actually, innate intelligence is real and important, and should be acknowledged and (when possible) enhanced and extended, but not used as a cudgel against others. I honestly think that most of the bad examples “in” the rationality community come from (unfortunately-)adjacent communities like TheMotte and sometimes HackerNews, not LessWrong/EA Forum proper.
Sorry, I was more criticizing a pattern I see in the community rather than you specifically.
However, basically everyone I know who takes innate intelligence as “real and important” is dumber for it. It is very liable to mode collapse into fixed mindsets, and I’ve seen this (imo) happen a lot in the rat community.
(When trying to criticize a vibe / communicate a feeling, it’s more easily done with extreme language; serializing loses information. Sorry.)
To the extent that this is actually true, I suspect it comes down to underrating luck as a factor, which I could definitely see as a big problem, and to not understanding that general innate intelligence isn’t all that widely spread (such that even selecting pretty hard for it will at best get you an OOM better than average, and that only if you land a supergenius and ridiculous outlier; real-life attempts get you at best 2-3x the median human, and that’s being generous).
In essence, I think general innate intelligence is real and it matters, but compared to luck and other non-intelligence factors it’s a drop in the ocean, and rationalists overrate it a lot.
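If it helps make the “narrow spread” point concrete, here’s a toy numerical sketch (my own illustration, not a model anyone in the thread proposed): assume IQ is distributed Normal(100, 15) and, very naively, read the raw score as a linear multiple of median capability. Under that contestable mapping, even absurdly hard selection stays well under 2x the median:

```python
# Toy sketch (assumptions, not claims from the thread): IQ ~ Normal(100, 15),
# with the raw score read, very naively, as a linear multiple of median capability.
from statistics import NormalDist

for rarity in (1e-2, 1e-4, 1e-6, 1e-9):   # top 1%, 1-in-10k, 1-in-1M, 1-in-1B
    z = NormalDist().inv_cdf(1 - rarity)  # SDs above the mean at that rarity
    iq = 100 + 15 * z
    print(f"top {rarity:.0e}: z = {z:.2f}, IQ ~ {iq:.0f}, ~{iq / 100:.2f}x median")
```

Even one-in-a-billion rarity tops out around 1.9x the median score here, which is at least consistent with the “2-3x at best, and that’s being generous” framing. (The whole illustration leans on the naive linear reading; if capability is instead superlinear in IQ, the ratios blow up, which is roughly where the disagreement lives.)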
I disagree quite a bit with the pattern of “there’s this true thing, but everyone around me is rounding it off to something dumb and bad, so I’m just gonna shout that the original thing is not-true, in hopes people will stop rounding-it-off”.
Like, it doesn’t even sound like you think the “real and important” part is false? Maybe you’d disagree, which would obviously be the crux there, but if this describes you, keep reading:
I don’t think it’s remotely intractable to, say, write a LessWrong post that convinces lots of the community to actually change their mind/extrapolation/rounding-off of an idea. Yudkowsky did it (as a knowledge popularizer) by decoupling “rationality” from “cold” and “naive”. Heck, part of my point was that SSC Scott has written multiple posts doing exactly that for the “intelligence” topic at hand!
I get that there are people in the community, probably a lot, who are overly worried about their own IQ. So… we should have a norm of “just boringly send people links to posts about [the topic at hand] that we think are true”! And if someone wrote or dug up a good post about [why not to be racist/dickish/TheMotte about innate intelligence], we should send the right people that link, too.
In four words: “Just send people links.”
I agree with the meta-point that extreme language is sometimes necessary (the paradigmatic example imho being Chomsky’s “justified authority” example of a parent yelling at their kid to get out of the road, yelling and/or swearing included). Good on you for making that decision explicit here.