Does this by extension imply that the type of instrumental rationality training advocated by LW is useless? Why, or why not?
The general rule of thumb for raw intelligence probably applies: you can damage it with unwise actions (like eating lead paint or taking up boxing), but there aren’t really any good ways to boost it beyond its natural unimpeded baseline. Good instrumental rationality can help you look out for and avoid self-sabotaging behavior, like overworking your way into burnout.
Decreasing work-load when you feel tired—the thing you naturally want to do—is also a reliable way to avoid burnout.
Largely, but not entirely. There are cases where evolution optimises for something different from what you want. And there are cases where the environment has changed faster than evolution can track.
Evolution always optimizes for the same thing :-/
If you want something different, that’s your problem :-D
Is it time to restart the “Read the Sequences” meme?
Specifically: The Tragedy of Group Selectionism
Well, at least read the wiki entry.
If some particular method of learning can be shown, through evidence, to be an improvement long-term, then by all means go for it. But until then, your prior belief has to be that it isn’t.
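The prior-vs-evidence point can be put in Bayesian terms: start from a skeptical prior that a given method works, and only a sufficiently strong study result should move you much. A minimal numerical sketch (all probabilities are illustrative assumptions, not figures from the thread):

```python
# Illustrative Bayesian update on whether a learning method actually helps.
# All numbers below are made-up assumptions for the sake of the sketch.
prior_works = 0.1           # skeptical prior: most proposed methods don't help
p_pos_if_works = 0.8        # chance a study shows improvement if the method works
p_pos_if_not = 0.2          # chance of a (false) positive result otherwise

# Bayes' theorem: P(works | positive study)
p_pos = p_pos_if_works * prior_works + p_pos_if_not * (1 - prior_works)
posterior_works = p_pos_if_works * prior_works / p_pos

print(round(posterior_works, 3))  # a single positive study leaves you well under 50%
```

With these assumptions, one positive result lifts the probability from 0.1 to about 0.31, so the skeptical default still dominates until more evidence accumulates.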