OK, what if you were to, say, at the end of each day brainstorm situations during the day when skill X could have been useful in order to get better at recognizing them?
Sounds like this would still run into the problem I anticipate: it would be hindered by poor innate memory and pattern-matching abilities, or by low conscientiousness. Some people just won’t recognize the situation even in retrospect, or will have already forgotten about it.
Here’s an example of what a less-than-ideal teaching scenario might look like. If MIT graduates are one end of the spectrum, that’s close to the other end, and most people are going to fall somewhere in between.
Could meditation be useful for this?
Meditation is definitely one of the more interesting self-improvement techniques where you basically just follow an algorithm. Still, it probably won’t increase your innate g, much like nothing else seems to. And there are some not entirely healthy subcultures around extensive meditation practices (detachment from the physical world as in “the only difference between an ideal monk and a corpse is that the monk still has a beating heart” and so on), which might be trouble for someone who really wants an algorithm to follow and grabs on to meditation without having much of a counterweight in their worldview.
“There exists no rationality curriculum such that a person of average IQ can benefit from it” and “there exists no rationality curriculum such that a person of LW-typical IQ can benefit from it” are not the same statement.
*shrug* It sounds as though you want a rationality curriculum to fail, given that you are brainstorming this kind of creative failure mode.
I want to believe that the rationality curriculum will fail iff it is the case that the rationality curriculum will fail.