7b) Is there any evidence I’ll be glad I went that a Christian retreat could not produce just as easily?
Edit: Okay, 15 seconds to this being downvoted was a little hasty.
I know that this is mere anecdote; and that after doesn’t strictly imply because of. But, since the mini-camp, people who know me would probably agree that:
I am more likely to try new things; in particular, I now have the habit of trying new habits to see what works and what doesn’t. This has helped in a handful of little ways:
I’ve stopped biting my nails.
I’ve stopped drinking soda.
I maintain a journal to get better information about myself.
I use Anki to memorize facts, instead of just thinking it’s a good idea. This has made my work rather more efficient.
I have more time and energy for both my academic work and other activities I enjoy.
I meet people more easily, and have more friends.
To emphasize the last point, uncomfortably personally: I am no longer cripplingly unable to examine my own sexuality, ask women out, or engage in relationships. (I’m still inexperienced for my age, though this improves over time.) These changes are due to techniques I learned at mini-camp: not lessons of the form “how to pick up women”, but “how to be right about yourself”.
Also, I suspect my writing has improved.
There are also internal, mental changes; and I suspect that the rate at which my agency improves has increased. But you’d get the same report in different words from someone after a Christian brainwashing retreat, so I suppose this is pretty weak evidence for you.
Hey, I’m glad to hear that :)
Finding people who could converse at a high level about the most important topics in the world was more fulfilling than I could have imagined. You can get some of this at a meetup—and I’ve been to meetups in Chicago, St. Louis, and the Bay—but the level of fulfillment I got at the mini-camp was the greatest by far.
Again, forgetting all the rationality training—there were moments at mini-camp when everyone was hanging out and I would literally have trouble deciding where to stand in a room, because every conversation going on around me was so ridiculously interesting that I couldn’t bear choosing where to place myself. I felt like a wealth of knowledge was being spilt around me, and if I didn’t scramble to consume as much as possible I’d miss some life-changing insight and regret it forever. It was so beautiful it hurt.
Wow. That’s like the opposite of most parties.
Can you describe the difference between a typical conversation at the mini-camp, and a typical conversation on LW? (Would it be accurate to say that you’re more impressed with the former than the latter? I’m curious to find out why if that’s the case.)
It would be accurate to say I’m more impressed with the former than the latter. I think the majority of this effect is caused by a) the conversations being in person, which is a better format than this nested Reddit thing, and b) the fact that we were together so long.
That said, the conversations were also more enjoyable and interesting than conversations I’ve had at meetups (which have often been fantastic). I’m not exactly sure why—perhaps experiencing the somewhat rigorous mini-camp generated a sense of camaraderie, and thus friendship?
After trying to adjust for the above effects, it also does seem to me that any residual difference in quality could have to do with the group that was selected. Luke did mention to me that they tried to choose a relatively extroverted set of people for the first mini-camp. Also, the level of professional success at the mini-camp was higher than most other groups I’ve been in, including meetups. (I also think the median age of the mini-camp must have been higher than the median ages of the meetups I’ve attended. At 21, I was one of the youngest there.)
So it’s more about the form of the conversations, and less about the content?
A problem I have with in-person group conversations is that I occasionally find that whoever is speaking is rambling or just not being as interesting as I’d hoped, and I wish there were some way to politely signal the person to make their point quickly and give someone else a turn. And then when I get a chance to speak, I fear that I’m not being as interesting as I expected to be when I decided to speak up, and that other people are thinking I should stop talking.
I’m curious if other people have had this problem and how they dealt with it.
An experiment I tried once, when I was helping moderate a 60-person round-robin discussion group (1), was to give everyone in the room four colored index cards: red, blue, green, and white, and assign them meanings by convention:
red = “I disagree with what the speaker is saying”
green = “I agree with what the speaker is saying”
blue = “I have a question about what the speaker is saying”
white = “I do not care about what the speaker is saying”
My theory was that by establishing a communication channel that supported multiple simultaneous inputs, I could get the flow control to be a lot more efficient.
The experiment mostly failed, in that people didn’t use the cards, so I can’t really speak to results. It still seems plausible to me, and I haven’t seen it done elsewhere.
===
1 - Don’t try this at home.
I think people already do something like this, using facial expressions and body language. Using your cards probably felt redundant, condescending (implying the speaker can’t read the standard signals), weird, or too explicit (e.g., when you want to signal disagreement/disinterest but also want plausible deniability).
So I guess I was hoping for some tips on how to read/send the usual signals, and what to do when someone rambles on despite sending the usual signals. Another idea I just thought of is to have a smartphone app that allows one to send a covert anonymous signal to the speaker (but it would probably take too much work to get everyone to set it up and use it).
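To make that app idea concrete, here is a minimal sketch of what such a backend might look like, using only Python’s standard library. Everything here (the endpoint names, the signal vocabulary, the port) is invented for illustration; it’s a toy, not a claim about how such a tool should actually be built:

```python
# Toy backend for covert, anonymous audience feedback: listeners POST a
# one-word signal from their phones, and the speaker polls the aggregate
# tallies. Aggregation is what keeps individual senders anonymous.
from collections import Counter
from http.server import BaseHTTPRequestHandler, HTTPServer

VALID_SIGNALS = {"wrap-up", "go-on", "question"}  # hypothetical vocabulary
tallies = Counter()

class FeedbackHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # e.g. POST /signal/wrap-up  (removeprefix needs Python 3.9+)
        signal = self.path.strip("/").removeprefix("signal/")
        if signal in VALID_SIGNALS:
            tallies[signal] += 1
            self.send_response(204)  # accepted, no body
        else:
            self.send_response(400)
        self.end_headers()

    def do_GET(self):
        # The speaker polls GET / and sees only aggregate counts.
        body = str(dict(tallies)).encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("", 8000), FeedbackHandler).serve_forever()
```

Getting everyone to open the page in the first place is, as noted, the hard part.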
Certainly. Those mechanisms weren’t working terribly reliably in a conversation that involved 60 people, which is precisely why I’d been looking for ways to augment the normal mechanisms.
Basically. But I think the form of the conversations leads to much better content, and more depth of exploration, and clearer / faster communication.
I honestly find that this is difficult. I think it’s easier to learn how to politely interrupt, or just be careful about the groups one hangs out in, or speak in smaller groups.
That is interesting. I try to keep my points short, when possible. I think short points also facilitate better communication; shorter back-and-forth periods enable people to ask for the specific information they need, and close inferential gaps.
As an attendee, my personal data might be relevant:
I have gained practice deliberately acquiring new habits and soliciting useful feedback. Before camp I had no specific plans for self-improvement other than “work harder”, and now I actually keep track of what works and what doesn’t. For instance, I am deliberately improving my public speaking skills by giving talks on Minicamp material once a week to a limited audience. I would place a bet that the “alternate universe me” who instead attended Inspirational Retreat X (IRX) would not have had lasting effects nearly a year later.
I am deliberately extending my network of contacts. Speaking to new people was a skill that I didn’t have pre-Minicamp. On this point, “alternate universe me” could reasonably have acquired similar skills from IRX, but I have relatively strong reason to believe that those skills would be much more of a black box than they are now. I usually leave workshops inspired, but I can tell a workshop was poor when I try to apply the skills I learned and discover that it’s not as easy as the instructor’s examples made it seem. There is a difference between “explaining something so that it sounds good” and “explaining something so someone else can do it”. I attend swing dancing workshops about once a month, and Minicamp never once felt inapplicable, unlike several of the lessons I’ve taken over the years.

More personal data: I talked a local CEO into letting me give a presentation on rationality to the class he teaches on the side at Penn State, which is something I would never have even thought about doing before Minicamp.
This comment has already gone on too long, but I hope that gives you some useful information.
Summary: Minicamp’s approach to teaching is more effective than that of the vast majority of small workshops I attend, because the instructors taught skills rather than inspiration. Inspiration came from trying their skills and discovering that they worked, which is surprisingly rare.
People reporting back from a Christian retreat are likely to report effects that Christians approve of—that they’re asking Jesus to help them decide in their daily life, that they feel a more full and whole relationship with God, etc. But those things (where they don’t require the existence of a God) are likely to be true—they really are doing those things.
If you went to a Jehovah’s Witness retreat, and were in an accident, and you were conscious enough to refuse a blood transfusion, you’d be glad for having learned what you did at the retreat, even if you knew the refusal would be fatal.
In general, anything that is compelling and affects your decisions will make you glad for it, and its being compelling is probably not inversely related to its being true. So I’m not too concerned that my tentative answer to this question is “no.”
I’m concerned, however, that the camp can’t produce evidence of the kind, “Before the minicamp, Mary Sue was in rehab for crack. A year later, she’s clean and has a successful web consultancy.” (Exaggerating the expected magnitude of change, of course.) Religious retreats don’t produce this, and tend to produce results more like, “Immediately after the retreat I felt really good, and a year later I do awesome on unobservable metrics!”
Before the bootcamp, I’d just barely managed to graduate college and didn’t have the greatest prospects for finding a job. (Though to be fair, I was moving to SF and it was a CS degree.)
At the bootcamp, I founded (and then folded) a startup with other bootcampers, which was profoundly educational and cost a couple months of time and <$100.
Now, <1 year after the bootcamp, I’m doing programming and design work on the new SimCity, which is as close to a dream job for me as could reasonably be expected to exist.
I can’t attribute all my recent success to the bootcamp, because I was pretty awesome beforehand, but it really did dramatically improve my effectiveness in a number of domains (my girlfriend is grateful for the fashion tips I picked up, for example). Other specific things I’ve found useful include meditation, value of information calculations, and rejection therapy.
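For readers unfamiliar with the term: a value of information calculation just compares your expected outcome with and without some obtainable knowledge. A toy example with invented numbers (not anything from the bootcamp curriculum):

```python
# Toy value-of-information calculation; all numbers are invented.
# Decision: launch a side project that pays +$5,000 if it succeeds and
# costs $2,000 in wasted effort if it fails. Prior P(success) = 0.4.
p_success = 0.4
win, loss = 5_000, -2_000

# Without further information, launch only if expected value is positive.
ev_launch = p_success * win + (1 - p_success) * loss  # 2000 - 1200 = 800
ev_no_info = max(ev_launch, 0)                        # launch: EV = $800

# A perfectly informative test would reveal the outcome in advance,
# letting us launch only in the success case.
ev_perfect_info = p_success * win                     # 0.4 * 5000 = 2000

voi = ev_perfect_info - ev_no_info                    # 1200
print(f"Value of perfect information: ${voi:,.0f}")
# Paying anything under $1,200 for such a test would be worth it.
```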
Replace “glad I went” with a better criterion- that question deserves a good response.
“Is there evidence this will be worthwhile according to my values now, independently of how it might change my values?”
“Is there evidence that this is instrumentally useful for more than warm fuzzies?”
“Is there evidence that the costs of this event are well matched to its probable benefit? I.e., if the benefit is substantially social, then even if this event would be worth flying around the world for, some program could be optimized specifically for social benefits, and/or I could attend a closer/cheaper/shorter program with similar benefits to me.”
“Regardless of anyone’s intent, what is this program optimized for?”
“How’s the food?”
Why our kind can’t cooperate.
The cooperation has actually been happening; it’s just that it was achieved by ostracizing the guy who asked if you were adhering to the principles expected of that kind.
Note that your original comment has positive and rising karma at this point. I have a high estimation of the minicamps (partially because I’m friends with fiddlemath, who really has appeared to level up since last summer in noticeable ways), but I’m glad that you’re stubborn about making SIAI/CMR show you some good evidence.
There are ways of making that point without saying it sounds like a “Christian brainwashing retreat.”
Sorry, “Christian retreat” didn’t convey the idea, and in any case I gave a link to a better explanation of the part of conceptspace I was trying to refer to. I’ll take it out since the link should suffice.
Thanks for your gracious apology. :)
If you don’t care whether the cooperation is doing useful work, then sure. Otherwise, criticism seems to be a necessary evil.
For the purpose of causal inference / intervention evaluation, you must ask if a Christian retreat would have had this effect on those participants. Perhaps Christians feel closer after a Christian event, but I find Christian events somewhat alienating because I’m not Christian. I don’t find aspiring rationalist events alienating, in part because I’m an aspiring rationalist. It’s fun to hang out with people who have common interests, and depending on who you are, that group is a different group… for me, it’s rationalists. Part of the point of the camp is that it has a similar bonding effect that any coming together of people with a deep common interest or aspiration can have, and in this case, the common aspiration is rationality.
Plus, at the camp, I did internalize skills and attitudes that have helped me a lot over the past year (i.e., I’ve improved much more over the past year than I have in previous years): for example, looking more vigilantly for fungibility between my time and money, looking more at the reasons I do things, and finding more effective ways to pursue those reasons...
Those particular effects I wouldn’t expect from a Christian camp, just as the particular effect of feeling close to Jesus is not an effect I’d expect from a rationality camp. I just happen to prefer the “rationality” effects, and these camps are for people with similar such preferences.
Seriously, it’s fun :)
If the primary motivation for attending is the emotional rewards of meeting others with an interest in rationality and feeling that you’ve learned how to be more rational, then yes, a Christian brainwashing retreat would make you glad you attended it in the same way, if and only if you are/became Christian (since non-Christians likely wouldn’t enjoy a Christian brainwashing retreat).
That said, since many of us have little or no data on changes in rationality (if any) among attendees, attending is the only real option you have to test whether it helps. Confirmation bias would make a positive result weak evidence, but it’d be relatively important given the lack of other evidence. Luckily, even if the retreat doesn’t benefit your objective level of rationality, it sounds worthwhile on the undisputed emotional merits.
I think what SilasBarta is trying to ask is: do we have any objective measurements yet from the previous minicamp that add weight to the hypothesis that this camp does in fact improve rationality or life achievement over either the short or long term?
If not, then I’m still curious: are there any plans to attempt to study the rationality of attendees and non-attendees to establish such evidence?
Yes, that’s an oft-repeated goal, and as Eliezer mentions in a sibling, there’s a one-year follow-up planned but it has not yet been a year.
Right, it’s been nearly a year since the last one. The long-term evidence is out there. How are attendees doing in their lives now vs. how they were doing before?
I’m pretty sure there’s been enough time to find this information out by now.
It’s hard to get objective evidence on this, because the participants were all pretty exceptional people to start off with, and there were so few of them, but there is an effort underway to collect what data we can from those that attended the longer Boot Camp—hopefully we’ll be able to report back within a month.
You’ll be able to explain the math behind what you do.
It’s easy to imagine a Christian brainwashing retreat run by someone similar to Luke that would also have that property.
Do you think a religious event would have the same effect on the same people? That is, these mostly atheist people who were all very interested in science and rationality? Or do you just think that there exist people on whom a religious event would have a similar effect?
This is an important distinction for someone deciding whether to attend, because such a person knows whether she is religious or not.
I’m not sure you answered the question. I think SilasBarta is looking for evidence, that someone can provide for him right now, that he will be glad he went.
ETA: and for purposes of this discussion, I think a bald assertion does not fall in the cluster “evidence”.
There’s lots of statistical data already in the post about evidence that you will be glad you went. That wasn’t what Silas Barta asked, and frankly I’m not sure this thread is going to be productive given the way the opening question was framed.
What would be a better framing?
How long would it take Anna to email the attendees and ask them to reply back about their current life status as compared to a year ago, so as to avoid proceeding with misleading evidence?
Edit in reply to the unmarked update to EY’s comment:
Like thomblake noted, the evidence being cited is not of the kind I asked for, which was why I framed my question with a link to Yvain’s lucid explanation of the problems of sorting out good retreats from bad. The exact evidence produced can be found just the same from (less useful) Christian brainwashing retreats, which is why I wasn’t impressed the last time around.
I do appreciate your efforts to apply the methods of rationality to your own endeavors.
Apparently the one-year followup is currently underway—a Minicamp attendee volunteered to do it.
This is pretty strong evidence of itself—people almost never volunteer for things and do them.
EDIT: OOPS! Anna said that an RBC attendee volunteered to do the RBC followup. RBC, as you know, was less successful than Minicamp, and we do not have someone doing the Minicamp followup yet.
I will remark that it’s more time-intensive than you seem to think—this is something that gets done after successfully hiring an executive assistant; we have a candidate but the hiring hasn’t yet occurred.
It would be strong evidence if the volunteer had completed the “do them” part, certainly.
Fair enough. One cannot update on evidence one has not yet received.
SilasBarta:
Eliezer_Yudkowsky:
Yes, people usually don’t do that. On the other hand, it isn’t implausible for someone who just returned from a “Christian retreat” and who is “on fire for God” to “volunteer for things and do them”. SilasBarta isn’t merely asking for evidence that the camp provides benefits; he is asking for a reason to think it has benefits that exceed those that can be obtained at other kinds of events (specifically, a “Christian retreat”).
Or, rather, a reason that exceeds the reasons that can be so obtained. That is, SB’s 7b relates to the relative quality of the reason for belief, not the relative quality of the benefits.
But you’re right that (for example) Christian retreats routinely get people to volunteer to do things and do them, so the simple fact of a Minicamp attendee doing so is not by itself strong evidence of a difference between the two events.
OTOH, there may well be sufficient differences between the two communities that the similarity of results is such evidence. That is, if event X1 gets result Y1 from a member of community Z1, while X2 gets Y2 from a member of Z2, the similarity of Y1 and Y2 given significant relevant differences between Z1 and Z2 suggests equally significant differences between X1 and X2. If Z2 is consistently more inclined to cooperate than Z1, and Y1/Y2 demonstrate willing cooperation, I conclude that X1 is more effective at inducing cooperation than X2.
(OTOOH, a lot depends on why Z2 cooperates more reliably. If it turns out that cooperation is primarily caused by the quality of Z2’s events, then that’s evidence against there being a significant difference between X1 and X2.)
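To put toy numbers on this (all invented): suppose “volunteering for a task and completing it” has a low baseline rate in community Z1 and a high one in Z2, and that an effective event raises the rate in either case. Then the same observation carries very different evidential weight:

```python
# Invented baseline rates of "volunteer and follow through" in each
# community, absent any special event.
baseline = {"Z1": 0.05, "Z2": 0.30}

# Assume an effective event raises that rate to 0.60 in either community.
effective_rate = 0.60

# Likelihood ratio of observing follow-through, for "the event was
# effective" vs. "nothing special happened":
lr = {z: round(effective_rate / p, 1) for z, p in baseline.items()}
print(lr)  # {'Z1': 12.0, 'Z2': 2.0}
# The identical observation yields a 6x larger likelihood ratio for the
# low-baseline community, matching the asymmetry described above.
```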
Yes, after I said “After Minicamp you will be able to explain the math behind what you do”, thus answering the original question, whereupon I was directed to answer other questions instead.
Assuming this is true, do you have a good model for why this was the case?
It is certainly the case that those at RBC were exposed to more of the “rationality teaching” material than those at Minicamp, so if this is true then it should probably be worrying.
I think it was a selection effect: at Mini-Camp, the people who went were chosen from a broad pool of anyone who could take a week off at the beginning of summer. But the only people who went to Mega-Camp were the types of people who could afford to take a whole summer off. So the Mega-Camp attendees were younger, more likely to be students, and less likely to have other things going on in their lives.
(I was a Mega-Camp attendee.)
Other potential reasons: it started to suck living & eating together in a tight, crowded space. It’s tolerable (and even fun!) for a week, but after a few weeks, privacy & space become an issue.
These are all good reasons why RBC would seem less awesome than Mini-Camp, but they aren’t actually good reasons why it should have been less effective at teaching people rationality. If anything, surely one would expect people who were younger and had less going on in their lives to benefit more from rationality training.
Basically, I agree with you that these are the reasons that Eliezer describes RBC as “less of a success”, but this just means that Silas is right, and the measure of “success” being used is “how awesome did everyone think it was”, not “how much rationality did we manage to teach”.
Agreed.
Yes, we have a good model for why this is the case, and it involves specific managers performing worse than others so the social cost of explaining our model is not zero.
We tried two experiments. The first one worked better, so we’re repeating that one instead of the second one.
I’m as open to the planning fallacy as anyone, but really, how long does it take to email everyone for their current life status?
Try it and see :)
I agree. But Silas is just doing his due diligence to ask that sort of question every time one of these things is mentioned, and surely that’s valuable to have around.
I left out the clause “that a Christian brainwashing retreat could not produce just as easily” in my retelling, since I was just noting an additional constraint. I don’t think the sort of evidence in the post above actually satisfies that criterion.