Rationality is not the same as intelligence, and I’m hoping that one of the spin-offs from Less Wrong is finding less challenging ways to explain how to use the knowledge and the brains you’ve got.
Keeping an eye out for exceptions to what you think you know and considering what those exceptions might mean isn’t a complicated idea.
Neither is the idea that everyone does things for reasons which make sense to them.
Internalizing such ideas may be emotionally challenging, but that’s a different problem.
In large part I’m dealing with this instinctive/emotional barrier of not only adopting counterintuitive beliefs, but also leaving the worldview you inhabit for another, which may be much less developed.
I do think it’s possible to boil the material down to a simpler form. There was a time when the Pythagorean theorem was the pinnacle of human thought, no doubt beyond the reach of the average person. Same for Newton’s work. Perhaps it takes a long time for cultural digestion of such concepts to find their accessible forms? Perhaps culture itself is hindering people from grasping what is ultimately simple? Or maybe the newest findings of QM etc. really are beyond the reach of certain people?
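(For reference, the theorem in question: for a right triangle with legs a and b and hypotenuse c, a^2 + b^2 = c^2. Once the frontier of mathematics, now grade-school material.)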
There’s a difference between understanding the latest findings of QM and understanding rationality.
I wonder to what extent people give up on thinking because of an educational system which discourages them. As far as I can tell, thinking isn’t really taught—the ability to think (and memorize and comply) is rewarded, which is a very different matter.
I think you’ve got rationality and intelligence bundled together too tightly, though I agree that there are probably thresholds of intelligence needed for particular insights.
And I’m pretty sure that one of the reasons rationality has a bad rep is the unnecessary “but you aren’t smart enough to play” attitude that sometimes comes with it.
In large part I’m dealing with this instinctive/emotional barrier of not only adopting counterintuitive beliefs, but also leaving the worldview you inhabit for another, which may be much less developed.
There was a recent post (sorry, no time to hunt it down) about how to evaluate new ideas that look cool so that you don’t accidentally screw up your life. This should definitely be taught as part of rationality.
Perhaps it takes a long time for cultural digestion of such concepts to find their accessible forms?
Intellectual development doesn’t seem to be a matter of time so much as man-hours, if that division makes sense. I suspect that if Eliezer were a psychologist and/or educator instead of a computer scientist, we would be looking at a “rationality for the everyman” project instead of SIAI / LessWrong.
So, what we need is for someone to take the problem of “how do I explain rationality to actual people with limited time” and work at it. HP:MoR is a start (it’s explaining rationality to a subset of Harry Potter fans, at least), but it’s not set up to give the right feedback.