Assessing your level is extremely hard in this case (it includes instrumental rationality, epistemic rationality, teaching ability, marketing ability, etc. etc.) and I really suggest that nobody do this without thinking about it very seriously beforehand.
Oh please no.
Overestimating the value of information, and allowing the perfect to be the enemy of the good are both common failure modes among Less Wrongers. You do not need to “assess your level” down to 16 sig figs (erm, pretend there is a unit of measurement here) along 7 different axes to put yourself on one or the other side of a binary measurement. You just need to ask: “Will listening to me talk about rationality be more likely to help someone, or hurt them?”
And as much as you (generic you, not you specifically) like to believe you are playing around with edgy, dangerous ideas, you are unlikely to cause serious harm to people by teaching a self-help workshop badly. (The people who WOULD be harmed by a badly taught self-help workshop have much worse things to worry about.) The cost of failure is not that high. You do not have to have an extremely high level of confidence in your success for an attempt to be your best course of action.
There is probably an optimal amount of serious thinking that should be done before embarking on this sort of endeavor, but the vast majority of Less Wrongers are going to land on the OVER-thinking side of that distribution, not the UNDER-analyzing side. Anecdotally, at the very first meetup I hosted, about a dozen people came, all of whom had “been intending to post a meetup” and had been waiting until they were better rationalists, had more information, etc. It is not surprising that this strategy had not accomplished much for them up to that point.
For that reason, at least here on LW, I am going to give the exact opposite of your advice: I really suggest that everyone try this out without thinking about it forever beforehand. Don’t be afraid to do low-cost experiments. Tighten your feedback loops.
I promise you that you will achieve more this way.
Don’t be afraid to do low-cost experiments. Tighten your feedback loops.
In general this is good advice. However, I disrecommend it in this specific case.
And as much as you (generic you, not you specifically) like to believe you are playing around with edgy, dangerous ideas, you are unlikely to cause serious harm to people by teaching a self-help workshop badly.
I’m not (that) worried about untrained but enthusiastic amateurs causing harm to other people, though I think this is more of a risk than you imply. I’m worried about untrained but enthusiastic amateurs causing harm to the public image of rationality, to potential future efforts along these lines, etc.
There are two failure modes here. There’s failure mode #1, where enthusiastic amateurs teach awful classes and cause some people to think less of ‘rationality’, and there’s failure mode #2 where CFAR graduates want to do cool things and don’t do them because they’re scared of failure, and a community never materializes. I think #2 is the default, and more likely, and thus worth taking more effort to avoid.
Those seem like very generalizable rationalizations for never actually doing anything.
On rationality amateurs causing harm to the public image of rationality:
They can (and DO) do this anyway: on LW, Reddit, Facebook, blogs, vlogs, etc. In fact, I would guess that an enthusiastic amateur could cause more overall harm to the movement on the internet than by running a class in person.
The people who are likely to say EXTREMELY harmful things are extremely unlikely to be the types to decide to lead an organization (leading one requires the related social skills).
What do you consider to be your worst case scenario? The worst I can come up with is “I taught a terrible workshop! Who would have thought I shouldn’t have talked about infanticide to a room full of new mothers? And that one of them made a viral video about it! I won’t be able to teach another class until everyone has forgotten it, in about two years!” (It is unlikely to have a significant effect on CFAR or MIRI.)
More realistic: “Wow, that was a terrible and boring class! I bet NONE of the twenty people in the room will come back next week. I will have to find new people now.”
Neither of these seems worth the level of risk aversion you are recommending here. We are not building an FAI.
I DO recommend placing yourself on the helpful/harmful binary. Obviously, a person who is so new and lacking in the relevant skills that they would cause massive harm would be in the “harmful” category. Unfortunate faux pas made by the type of people in the helpful category are unlikely to happen at a large scale.
Regarding harming “potential future efforts along these lines”:
Efforts made by whom?
-By myself? Because I suspect future-me will be significantly more skilled at running classes than current-me despite lack of practice?
-By CFAR? I have never seen anything from them suggesting that other people (even amateurs) should hold off on creating communities. Quite the contrary: they have invested significant effort and resources in spreading materials and knowledge to assist meetup organizers (writing the Meetup Guide, favoring workshop applicants who are meetup organizers for spots and scholarships, etc.). This is strong evidence that CFAR WANTS people to take on leadership roles and grow their local scenes. It is unlikely that this is counter to their current or future goals.
-By a yet-unknown organization? Obviously bad reasoning.
This reasoning is overly generalizable to “I should never attempt anything I care about that I don’t have extremely high confidence of succeeding in, because Failure.”
Yes, and this is a very serious problem that really shouldn’t be exacerbated any further at all.
The people who are likely to say EXTREMELY harmful things are extremely unlikely to be the types to decide to lead an organization (leading one requires the related social skills).
I don’t agree. Leaders of organizations say outrageous or harmful things all the time, social skills or no social skills.
What do you consider to be your worst case scenario?
Worst case likely scenario?
Rationality becomes karate. There are dozens or hundreds of different people claiming to teach rationality. What they actually teach varies wildly from instructor to instructor. Some groups teach effective skills; some groups teach useless skills; some groups teach actively hazardous skills. In the eyes of the general public, these groups are not distinguishable from one another—they all provide “rationality training.”
A newcomer to the field has no idea what groups are good and is not likely to find a good one. Worse, they may not even know that good and bad groups exist, and ultimately gain a degree of confidence unsuited to their skill level. It is dangerous to be half a rationalist, which many learn the hard way. Ultimately, rationality training becomes diluted or confused enough that it more or less possesses no value for the average person.
I promise you that you will achieve more this way.

Agreed with this x10!