One possible way to address this that’s under discussion is adding (for example) an hour right after lunch which is “quiet time” where folks are encouraged to nap, journal, go over notes, exercise, meditate, etc.
Mmm. This seems like an okay plan, but it doesn’t hit the root of the problem, which is that the marginal unit of social interaction at the workshop is high value. Someone who did take the hour to journal instead of interact with other participants would probably be making a mistake, even if they’re starting to get agitated from too much social interaction.
The only ways I can think of to make them more introvert-friendly in that sense are to make them shorter or longer, neither of which seems like a good idea for economic reasons. Short workshops that occur regularly in one location, targeted at locals- basically, the old idea of a rationality dojo- seem worth considering again, but I don’t see a way to extend that beyond SF and NY very easily.
I’m rather skeptical of the “rationality dojo” concept because regular dojos are far from a reliable training method. In my experience in the martial arts, I’ve been taught things that are critically unsafe, things that would be illegal to use in almost any real-world setting, and things that just plain don’t work. Finding good dojos is actually a fairly difficult problem.
Patterning a new training paradigm after one that fails in the majority of cases seems somewhat dubious to me. Also, the conventional dojo model is, uh, not exactly optimal for introverts.
Are you comparing it to some other training paradigm that succeeds in the majority of cases? If so, do you consider workshops to be such a paradigm, or do you have some other paradigm in mind?
I’m rather skeptical of the “rationality dojo” concept because regular dojos are far from a reliable training method.
The parts of the dojo model that I’m thinking of importing are:
Regular periodic meetings of a few hours, probably weekly or monthly.
A geographically local userbase.
Clear skill gradations and demarcations and test-based advancement.
Regular open training periods.
Basically, this seems to manifest as a skill-focused meetup with a bit more structure than normal, and possibly more cash transfers / dues than normal. Do you have warnings about those features, suggestions of other features I should think about importing, or other comments?
Short workshops that occur regularly in one location targeted at locals- basically, the old idea of a rationality dojo- seem like it’s worth considering again, but I don’t see a way to extend that beyond SF and NY very easily.
(Given that you are at a level where you will cause more good than harm) Decide that you will be a teacher instead of a student and start your own.
This has been my current endeavor here in Columbus. It is going better than expected, and has been easier than expected.
What worked for us, but is possibly generalizing from one example: First, develop a close-knit group of equals (good for discussion and socialization). THEN (my personal aha! moment here)… recruit a whole bunch of newbies all at once (good for having organized workshops and classes). You can ask people from the first group to lead various classes for the people in the second group, so that you don’t have to do it all yourself. Note: Don’t ask people as a group. Ask specific individuals for specific workshops.
It is significantly harder to organize workshops and classes among people you think of as your approximate equals in the skill in question, even though every single individual there may prefer it. (This had been our failure mode for a while.)
(@Vaniver- I know you, so I know that you personally probably already know all this. This comment is more for LWers in general who are thinking about going the “organizer” route.)
Assessing your level is extremely hard in this case (it includes instrumental rationality, epistemic rationality, teaching ability, marketing ability, etc. etc.) and I really suggest that nobody do this without thinking about it very seriously beforehand.
Assessing your level is extremely hard in this case (it includes instrumental rationality, epistemic rationality, teaching ability, marketing ability, etc. etc.) and I really suggest that nobody do this without thinking about it very seriously beforehand.
Oh please no.
Overestimating the value of information, and allowing the perfect to be the enemy of the good are both common failure modes among Less Wrongers. You do not need to “assess your level” down to 16 sig figs (erm, pretend there is a unit of measurement here) along 7 different axes to put yourself on one or the other side of a binary measurement. You just need to ask: “Will listening to me talk about rationality be more likely to help someone, or hurt them?”
And as much as you (generic you, not you specifically) like to believe you are playing around with edgy, dangerous ideas, you are unlikely to cause serious harm to people by teaching a self-help workshop badly. (The people who WOULD be harmed by a badly taught self-help workshop have much worse things to worry about.) The cost of failure is not that high. You do not have to have an extremely high level of confidence in your success for an attempt to be your best course of action.
There is probably an optimal amount of serious thinking that should be done before embarking on this sort of endeavor, but the vast majority of Less Wrongers are going to be on the OVER-thinking side of that line, not the UNDER-thinking side. Anecdotally, at the very first meetup I hosted, about a dozen people came, all of whom had been “intending to post a meetup” and had been waiting until they were better rationalists, or had more information, etc. It is not surprising that this strategy did not accomplish much for them up to that point.
For that reason, at least here on LW, I am going to give the exact opposite of your advice: I really suggest that everyone try this out without thinking about it forever beforehand. Don’t be afraid to do low-cost experiments. Tighten your feedback loops.
I promise you that you will achieve more this way.
Don’t be afraid to do low-cost experiments. Tighten your feedback loops.
In general this is good advice. However, I disrecommend it in this specific case.
And as much as you (generic you, not you specifically) like to believe you are playing around with edgy, dangerous ideas, you are unlikely to cause serious harm to people by teaching a self-help workshop badly.
I’m not (that) worried about untrained but enthusiastic amateurs causing harm to other people, though I think this is more of a risk than you imply. I’m worried about untrained but enthusiastic amateurs causing harm to the public image of rationality, to potential future efforts along these lines, etc.
There are two failure modes here. There’s failure mode #1, where enthusiastic amateurs teach awful classes and cause some people to think less of ‘rationality’, and there’s failure mode #2 where CFAR graduates want to do cool things and don’t do them because they’re scared of failure, and a community never materializes. I think #2 is the default, and more likely, and thus worth taking more effort to avoid.
Those seem like very generalizable rationalizations for never actually doing anything.
On rationality amateurs causing harm to the public image of rationality-
They can (and DO) do this anyway. On LW, reddit, facebook, blogs, vlogs, etc., etc. In fact, I would guess that an enthusiastic amateur could cause more overall harm to the movement on the internet than by running a class IRL.
The people who are likely to say EXTREMELY harmful things are extremely unlikely to be the types to decide to lead an organization (doing so requires related social skills).
What do you consider to be your worst case scenario? The worst I can come up with is “I taught a terrible workshop! Who would have thought I shouldn’t have talked about infanticide to a room full of new mothers? And that one of them made a viral video about it! I won’t be able to teach another class until everyone has forgotten it in about two years! (It is unlikely to have a significant effect on CFAR or MIRI.)”
More realistic: “Wow, that was a terrible and boring class! I bet NONE of the twenty people in the room will come back next week. I will have to find new people now.”
Neither of these seem worth the level of risk aversion you are recommending here. We are not building an FAI.
I DO recommend placing yourself on the helpful/harmful binary. Obviously, a person who is so new and lacking in the relevant skills that they would cause massive harm would be in the “harmful” category. Unfortunate faux pas made by the type of people in the “helpful” category are unlikely to be at a large scale.
Regarding harming “potential future efforts along these lines”:
Efforts made by whom?
-By myself? Because I suspect future-me will be significantly more skilled at running classes than current-me despite lack of practice?
-By CFAR? I have never seen anything from them suggesting that other people (even amateurs) should hold off on creating communities. Quite the contrary. They have invested significant effort and resources in spreading materials and knowledge to assist meetup organizers (writing the Meetup Guide, favoring workshop applicants who are meetup organizers for spots and scholarships, etc.). This is strong evidence that CFAR WANTS people to take on leadership roles and grow their local scene. It is unlikely that this is counter to their current or future goals.
-By a yet-unknown organization? Obviously bad reasoning.
Overly generalizable to “I should never attempt anything I care about that I don’t have an extremely high confidence of succeeding in, because Failure”
Yes, and this is a very serious problem that really shouldn’t be exacerbated any further at all.
The people who are likely to say EXTREMELY harmful things are extremely unlikely to be the types to decide to lead an organization (doing so requires related social skills).
I don’t agree. Leaders of organizations say outrageous or harmful things all the time, social skills or no social skills.
What do you consider to be your worst case scenario?
Worst case likely scenario?
Rationality becomes karate. There are dozens or hundreds of different people claiming to teach rationality. What they actually teach varies wildly from instructor to instructor. Some groups teach effective skills; some groups teach useless skills; some groups teach actively hazardous skills. In the eyes of the general public, these groups are not distinguishable from one another—they all provide “rationality training.”
A newcomer to the field has no idea what groups are good and is not likely to find a good one. Worse, they may not even know that good and bad groups exist, and ultimately gain a degree of confidence unsuited to their skill level. It is dangerous to be half a rationalist, which many learn the hard way. Ultimately, rationality training becomes diluted or confused enough that it more or less possesses no value for the average person.
(Given that you are at a level where you will cause more good than harm) Decide that you will be a teacher instead of a student and start your own.
So, a vaguely similar plan is in the works here in Austin, but unless it goes spectacularly I don’t expect that to happen unless I move to SF or NY.
I promise you that you will achieve more this way.
Agreed with this x10!