Written communication has many advantages, but it typically does not make you actually do the exercises. One just looks briefly at the exercise, thinks “yeah, I see what they are trying to do”, and then clicks another hyperlink or switches to another browser tab.
Having five minutes without internet access, plus some social pressure to complete the exercise, can get people to actually do the exercises they found on the internet a decade ago but never tried.
Sure, everyone is different, but I would expect most people who spend a lot of time on the internet to be like this. (And the people who don’t spend a lot of time on the internet won’t see LifeHacker or LessWrong, unless a book version is published.)
I see MOOCs as a big educational improvement because of this: sure, I could get the same educational information without the MOOC structure, just by reading the field’s best textbooks and academic papers; but having a specific “course” with quizzes and homework makes me actually do the exercises, which I wouldn’t have done otherwise; and the course schedule forces me to do them now, instead of postponing them for weeks, months, or forever.
Absolutely true. Some people, perhaps most, don’t do the exercises. Also true that some people (myself included) do. Even if only a small percentage of people do the exercises they read about, and only some of the time, that still scales better than in-person classes.
On continuing reflection it occurs to me that there’s a third scalable technique for increasing rationality, at least in theory: software. I’ve seen a few attempts to set up tools like the calibration game in software. So far they haven’t caught on, but it might be worth exploring further, especially if this can be worked into games the way HpMOR works rationality into really gripping fiction.
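For illustration, here is a minimal sketch of the scoring core such a calibration tool might use (the function name and data format are my own assumptions, not any existing tool’s API):

```python
from collections import defaultdict

def calibration_report(answers):
    """Given (stated_confidence, was_correct) pairs, report actual
    accuracy for each confidence level the player stated."""
    buckets = defaultdict(list)
    for confidence, correct in answers:
        buckets[confidence].append(correct)
    return {conf: sum(results) / len(results)
            for conf, results in sorted(buckets.items())}

# A well-calibrated player's 70%-confidence answers should be
# right about 70% of the time.
sample = [(0.7, True), (0.7, True), (0.7, False), (0.9, True), (0.9, True)]
print(calibration_report(sample))  # {0.7: 0.666..., 0.9: 1.0}
```

The gamification layer (scoring streaks, leaderboards, question banks) would sit on top of a comparison like this between stated confidence and actual accuracy.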
Thinking back on my life, board games, card games, and D&D-type RPGs helped me learn basic probability and game theory without my explicitly attempting to do so. I’m not sure today’s videogames, fun as they are, have the same virtuous and useful side effects. I wonder if it would be worthwhile to develop a really gripping game that rewarded rational play and probability-based reasoning and implicitly taught it.
Even if only a small percentage of people do the exercises they read about, and only some of the time, that still scales better than in-person classes.
That’s a good point. I am not sure about the numbers today: how many people read LW, what percentage of them would do the exercises, and whether that is more than the number of minicamp participants. But these numbers could be improved by, e.g., converting the minicamps into a series of online lessons.
I guess this is a great opportunity for a CFAR volunteer with video-editing skills.
The recent XCOM game meets your criteria to some extent (a few bugs aside). Every move matters and must be carefully planned, and there are very few actions you can carry out that don’t carry a chance of failure. You quickly learn when you can afford to be ambitious and when you need a backup plan in case things go wrong. Even better, in Ironman mode your ~30-hour save can easily be more or less ruined in a single turn if you make particularly poor choices (or get very, very unlucky) and you have no save game to resume—you have to start over from the beginning.
My experience talking to other people playing it isn’t, however, particularly promising when it comes to implicit teaching. One friend has complained every single time he’s missed a 98% chance (“it’s bullshit”), even when I pointed out that you make thousands of attacks over the course of a game and should expect to see multiple misses at those odds.
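The arithmetic behind that point is easy to check; a quick sketch (the hit chance and attack counts are just illustrative numbers):

```python
def expected_misses(hit_chance, attacks):
    """Expected number of misses over a run of independent attacks."""
    return (1 - hit_chance) * attacks

def p_at_least_one_miss(hit_chance, attacks):
    """Probability of seeing at least one miss in that many attacks."""
    return 1 - hit_chance ** attacks

# At 98% to hit, a thousand attacks yield ~20 misses on average,
# and even 100 attacks make at least one miss very likely.
print(expected_misses(0.98, 1000))     # ~20.0
print(p_at_least_one_miss(0.98, 100))  # ~0.867
```

So a player who never sees a 98% shot miss over a long campaign would actually be the surprising case.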
If you haven’t seen it before, this is an entertaining video series that demonstrates the salient points quickly.
“Doing the exercise” is not something that the student does alone, with the result compared to the answer key in the teacher’s edition textbook. To perform the exercises, the student needs other people with enough understanding of the subject to provide short-cycle feedback, and I don’t know anyone who can do that for themselves.
This is basically our problem. We could definitely teach the theory of, say, our Installing Habits class remotely, but we spend a lot of it troubleshooting people’s actual plans for setting up new habits and giving rapid, personalized feedback, and it’d be quite hard to build that into automated exercises.