I think it comes down to a combination of 1) not being very confident that CFAR has the True Material yet, and 2) not being very confident in CFAR’s ability to correct misconceptions in any format other than teaching in-person workshops. That is, you might imagine that right now CFAR has some Okay Material, but that teaching it in any format other than in person risks misunderstandings where people come away with some Bad Material, and that neither of these is what we really want, which is for people to come away with the True Material, whatever that is. There’s at least historically been a sense that one of the only ways to get people to approximately come away with the True Material is for them to actually talk in person to the instructors, who maybe have some of it in their heads but not quite in the CFAR curriculum yet.
(This is based on a combination of talking to CFAR instructors and volunteering at workshops.)
It’s also worth pointing out that CFAR is incredibly talent-constrained as an organization, and that there are lots of things CFAR could do that would plausibly be a good idea and that CFAR might even endorse as plausibly a good idea but that they just don’t have the person-hours to prioritize.
I am somewhat suspicious that one of the reasons (certainly not the biggest, but one of them) for the lack of these things is that it lets them more readily instill concern about AI safety.
At the mainstream workshops there is no mention of any topic in the neighborhood of AI safety anywhere in the curriculum. If it comes up at all, it’s in informal conversations.
Could a possible solution be to teach new teachers?
How far is a person who “knows X” from a person who “can teach X”? I imagine that being able to teach X has essentially two requirements: First, understand X deeply—which is what we want to achieve anyway. Second, general teaching skills, independent of X—these could be taught as a separate package, which could already be interesting for people who teach. And then what you need is written material containing all the known things that should be considered when teaching X, and a short lesson explaining the details.
The plan could be approximately this:
1) We already have lessons for X, for Y, for Z—what CFAR offers to participants already.
2) Make lessons for teaching in general—and offer them to participants, too, because that is a separately valuable product.
3) Make lessons on “how to teach X” etc., each of them requiring lessons for “X” and for “general teaching” as prerequisites. These will be for volunteers wanting to help CFAR. After the lessons, have the volunteers teach X to some random audience (for a huge discount or even for free). If the volunteer does it well, let them teach X at CFAR workshops; first with some supervision and feedback, later alone.
Yep, CFAR is training new instructors (I’m one of them).
I imagine that being able to teach X has essentially two requirements: First, understand X deeply—which is what we want to achieve anyway. Second, general teaching skills, independent of X—these could be taught as a separate package, which could already be interesting for people who teach.
In the education literature, these are called content knowledge and pedagogical knowledge respectively. There is an important third class of thing called pedagogical content knowledge, which refers to specific knowledge about how to teach X.
The example Val likes to use is that if you want to teach elementary school students about division, it’s really important to know that there are two conceptually distinct kinds of division, namely equal sharing (you have 12 apples, you want to share them with 4 friends, how many apples per friend) and repeated subtraction (you have 12 apples, you have gift bags that fit 4 apples, how many bags can you make). This is not quite a fact about division, nor is it general teaching skill; it is specifically part of what you need to know to teach division.
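To make the distinction concrete, here is a toy sketch (mine, not from the thread) of the two kinds of division. Both procedures reach the same numeric answer for 12 ÷ 4, but they model different questions:

```python
def equal_sharing(total, num_friends):
    """Partitive division: split `total` items evenly among `num_friends`.
    Answers "how many per group?" (12 apples, 4 friends -> apples per friend)."""
    return total // num_friends

def repeated_subtraction(total, bag_size):
    """Quotative division: repeatedly remove groups of `bag_size` from `total`.
    Answers "how many groups?" (12 apples, bags of 4 -> number of bags)."""
    bags = 0
    while total >= bag_size:
        total -= bag_size
        bags += 1
    return bags

print(equal_sharing(12, 4))         # -> 3 (apples per friend)
print(repeated_subtraction(12, 4))  # -> 3 (gift bags made)
```

The point of the example survives the code: the two functions compute the same quotient by conceptually different processes, which is exactly the pedagogical content knowledge a teacher of division needs.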
There’s a difference between “knowing X” and having X be a default behavior. There’s also a difference between knowing X and being able to teach it to people who think differently than oneself or have different preconceptions.
If you look at their staff page you find they’ve recently trained like 10 new teachers.
Sounds like great news. Or perhaps an experiment with potentially high value.
Promising pilot projects often don’t scale.