I attended the 2011 minicamp.
It’s been almost a year since I attended. The minicamp has greatly improved me along several dimensions.
I now dress better and have used techniques from minicamp to become more relaxed in social situations. I’m more aware of how I’m expressing my body language. It’s not perfect control, and I haven’t magically become an extrovert, but I’m better able to interact successfully in random social situations. Concretely: I’m able to sit and stand around people I don’t know and both feel and present myself as relaxed. People have noticed that I dress better and I’ve received multiple comments to that effect. I’ve chosen particular ways to present myself and now I get comments like ‘you must play the guitar’ (this has happened five times since minicamp, haha). This is good, since it loads the initial assumptions I want the person to load.
I’ve intentionally hacked my affect toward various things to better reach my goals. For years I didn’t want to have children. Earlier this year, after minicamp, my wife said she wanted kids. I was surprised, and realized that given various beliefs (love for my wife, more kids being good for society, etc.) I needed to bring my emotions in line with those beliefs. I did this by maximizing positive exposure to kids and focusing on the good experiences... and it worked. I’m sure nature helped, but I arrived at a change of emotional reaction that feels very stable. TMI: I had my vasectomy reversed and am actively working on building kid version 1.0.
Minicamp helped me develop a better mental language for reasoning about rationalist principles. I’ve got tools for establishing mental breakpoints (recognizing states of surprise, rationalization, etc.) and a sense of how to improve weak areas in my reasoning. I have a LOT of things I still need to improve; many of my actions still don’t match my beliefs. The upside is that I’m aware of many of the gaps and can make progress toward closing them. There seems to be only so much I can change at once, so I’ve been prioritizing.
I’ve used this more concise, direct reasoning at my job at Valve Software to help make better decisions. Concretely: when deciding which features to add to DOTA 2, I’ve worked particularly hard at quickly relinquishing failed ideas that I generated. I’ve developed litanies like ‘my ideas are a product, not a component of my identity.’ Before I enter into interactions I pause and think, ‘what is my goal for this interaction?’ The reasoning tools from minicamp have helped me better teach and interpret the values of my company (which are very similar). I helped write a new employee guide that captures Valve values, but uses tools such as Anna Salamon’s “Litany for Simplified Bayes” to cut straight to the core concepts: “If X is true, what would the world look like?” “If X is not true, what would the world look like?” “What does the world look like?” I’ve also been influential in instituting prediction meetings before we launch new features.
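(A gloss of my own, not part of the litany: the first two questions ask you to estimate the likelihoods P(E|X) and P(E|¬X), the third asks which evidence E you actually observe, and together those are all you need for the standard Bayesian update:

$$P(X \mid E) = \frac{P(E \mid X)\,P(X)}{P(E \mid X)\,P(X) + P(E \mid \neg X)\,P(\neg X)}$$

If the world looks much more like the X-is-true answer than the X-is-false answer, the update toward X is large; if both answers look the same, the observation tells you nothing.)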
I’ve been better able to manage my time, because I’m more aware of the biases and pitfalls that lie before me. I think more about what ‘BrandonReinhart2020’ wants than what the current me wants (or at least my best guess as to what he would want... like not being dead, and being a badass guitar shredder, etc.). This has manifested concretely in my self-education on the guitar. When I went to minicamp I had only just started learning guitar. Since then I’ve practiced 415 hours (I work full time, so this is all in my spare time) and have developed entirely new skills. I can improvise, write songs, etc. Minicamp provided some inspiration, yes, but there were also real tools that I’ve employed. A big one was coming home and doing research on human learning and practice, which helped me realize that my goals were achievable. Luke gave sessions on how to do efficient research. Critch gave a session on hacking your affect. I used this to make practice something I really, really like doing: I listened to music I liked before practicing, I put objects I liked around my practice area (role-playing books, miniatures; nerdy, yes, but it worked for me), and I would drink a frosty beer after practicing three hours in a row. (Okay, that last one shows that my health beliefs and goals may not be entirely in line, but it served an objective here.) Now I can easily practice for three hours and enjoy every moment of it. (This is important: before, I would use that time for World of Warcraft and other pursuits that just wasted time and didn’t improve me.)
I’ve been in the Less Wrong orbit for a long time, with the goal of improving my rationality for just as long. I’ve read Yudkowsky’s writing since the old SL4 days and followed Overcoming Bias from the beginning. Even so, I can’t say I had a really good grasp of which concepts were most important until after minicamp. There’s huge value in being able to ask questions, debate a point, and quickly clarify your confusion.
I have also been an SIAI skeptic. Both John Salvatier and I thought that SIAI might be a little religion-like. That was our mistake. The minicamp was a meeting of really smart people who wanted to help each other win more. It was genuinely about mental and social development and the mastery of concepts that seem to lead to a better ability to navigate complex decision trees toward desired outcomes.
While we did talk about existential risk, the SIAI never went deep into high-shock-level concepts that might alienate attendees. It wasn’t an SIAI funding press. It wasn’t an AGI press. In fact, I thought they almost went too light on the subject (but I came to modern rationality from trans/posthumanism, and most people in the future will probably get to trans/posthumanism from modern rationality, so discussion of AGI and the like feels normal to me). The point being: if you have concerns about this, you’ll feel a lot better once you attend.
I would say the thing that most discomforted me during the event was the attitude toward meditation. I realized, though, that this was an indicator of my preconceptions about meditation and not necessarily of facts about meditation. After talking to several people, I learned that there isn’t any funky mysticism inherent to meditation, just mysticism closely associated with it. Some people are trying to figure out whether it can be used as a tool and how to run experiments around it. I updated away from ‘meditation is a scary religious thing’ toward ‘meditation might be another trick for the bag.’ I decided to let other people bear the burden/risk of doing the research there, though. :)
Some other belief shifts related to minicamp: I have updated strongly toward Less Wrong style rationality practices being legitimate tools for making better decisions. I have updated a great deal toward the SIAI being a net good for humanity, and a great deal toward the SIAI being led by the right group of people (after personal interactions with Luke, Anna, and Eliezer).
Comparing minicamp to a religious retreat seems odd to me. There is something exciting about spending time with a bunch of very smart people, but it’s more like the kind of experience you’d have at a domain-specific research summit. The experience isn’t built on manipulation through repeated and intense appeals to emotion, guilt, etc. (I was a Wesleyan Christian when I was younger and went to retreats like the Walk to Emmaus, and I still remember them pressing a nail sharply into my palm as I went to the altar to pray for forgiveness). It’s more accurate to think of minicamp as a rationality summit, with the instructors presenting findings, sharing techniques for replicating those findings, and hosting an ongoing open discussion of the findings and the process used to generate them. And like any good summit, there are parties.
If you’re still in doubt, go anyway. I put the probability of self-damage from attending minicamp at extremely low, on par with the risk from attending a standard college-level economics lecture or a managerial business skills workshop. It doesn’t even blip on a radar calibrated to the kind of self-damage you could speculatively incur at a religious retreat.
If you’re a game developer, you would probably do more to improve your product decision-making by attending SIAI Minicamp than by attending GDC (though GDC is still valuable for building a social network within the industry).
What about the cost? I would not call spending $1500 in a week insignificant. And as a baseline, I believe that being surrounded for a week by a group of people who believe strongly in some collection of ideas carries a risk at least an order of magnitude higher than an economics lecture does. I certainly expect that it would have a much stronger effect on me (as it seems to have had on you) than the lecture would, and I would most certainly not take a risk of this magnitude if I had any non-negligible doubts.
To address your second point first, the -attendees- were not a group who strongly shared common beliefs. Some attended because of lots of prior exposure to LW; a very small number were strong x-risk types; several were there only because of recent exposure to things like Harry Potter and the Methods of Rationality and were curious; many were strongly skeptical of x-risks. There were no discussions that struck me as cheering for the team, and I was actively looking for them!
Some counterevidence, though: there was definitely a higher occurrence of cryonicists and people interested in cryonics than you’d find in a random sample of 30 people. Given how rare cryonics signups are in the general population, a random sample of 30 should contain essentially zero, while we had more than two. So we weren’t a wildly heterogeneous group.
As for the instructors: Anna and Luke were both very open about the fact that the rationality-education process is in its infancy, and that among the various SIAI members there is discussion about how to proceed. I could be wrong, but I interpreted Eliezer as somewhat skeptical of the minicamp process. When he visited, he said he’d had almost no involvement with the minicamp; I believe he said he was mainly a sounding board for some of the ideas. I’m interpreting his involvement in this thread and related threads/topics as a belief shift on his part toward the minicamp being valuable.
I think your order-of-magnitude risk estimate describes a conceivable bad scenario well, but describes the scenario I actually witnessed poorly.
Now, for cost, I don’t know. I’m attending a guitar camp in August that will be 7 days and cost me $2000. I would put the value of minicamp a fair amount above the value of the guitar camp, but I wouldn’t necessarily pay $3000 to attend minicamp. To answer the price question I would ask:
1) What else do I plan to spend the $1500 on? What plans or goals suffer setbacks? What would I otherwise buy?
2) What do I value the information from attending at? I can see how it would be easier to measure the value of information from a guitar camp than from one about something that feels more abstract. So maybe the first step is to find the concrete value you’ve already gotten out of LW. If you’ve read the Sequences and you think there are useful tools there, you might start with ‘what would be the estimated value of being able to clarify the things I’m unsure about?’ Then take some measurement of the value you’ve already gotten from LW and do some back-of-the-napkin math with that (see the sketch at the end of this comment).
3) Consider your level of risk aversion versus the value of attending minicamp now rather than later. If these new minicamps are successful, more people will post about them, and those attendees will validate or contradict past attendees’ experiences. If $1500 is too much for you when measured against your estimate of the payoff, discounted by the risks, you can simply wait: either the camps will be shown to be valuable or they will be shown to be of low value.
4) Consider some of the broad possible future worlds that follow from attending minicamp. In A you attend and things go great; you come out with new rationality tools. In B you attend, your reaction is neutral, and you don’t gain anything useful. In C you attend and have poor experiences, or worse, suffer some kind of self-damage (e.g., your beliefs shift in measurably harmful ways that your prior self would not have agreed to submit to ahead of time). Most attendees are suggesting you’ll find yourself in worlds like A. We could be lying because we all exist in worlds like C, or we could be in B but feel an obligation to justify attending the camp. Weigh your estimate of our veracity against your risk aversion and update the connected values.
I would suggest it’s unlikely that the SIAI is so skilled at manipulation that it has succeeded in subverting an entire group of people from diverse backgrounds, many with some predisposition to be skeptical. Look for evidence that some people exist in B or C, probably in the form of direct posts stating as much; people would likely want to prevent others from being harmed.
There are other considerations you could weigh when deciding whether to spend the money, but these are some of the main ones.
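To make point 2’s back-of-the-napkin math and point 4’s possible worlds concrete, here is a minimal sketch. Every number in it is a hypothetical placeholder of mine, not a claim about the actual payoff; plug in your own estimates.

```python
# Back-of-the-napkin expected value for the attend/don't-attend decision.
# All probabilities and dollar values below are made-up placeholders.

# Subjective probabilities for the three worlds described in point 4.
p_good = 0.6      # world A: useful new rationality tools
p_neutral = 0.3   # world B: nothing gained
p_harm = 0.1      # world C: some kind of self-damage

# Dollar-denominated value estimates for each outcome, anchored to
# whatever concrete value you think you've already gotten from LW.
value_good = 5000.0    # e.g. better decisions at work over a few years
value_neutral = 0.0
value_harm = -3000.0   # cost of harmful belief shifts; hard to estimate

cost = 1500.0  # the minicamp fee; add travel and lost wages if relevant

expected_value = (
    p_good * value_good
    + p_neutral * value_neutral
    + p_harm * value_harm
    - cost
)
print(f"Expected value of attending: ${expected_value:+.2f}")
# With these placeholders: 0.6*5000 + 0.1*(-3000) - 1500 = +1200.
```

If the result stays positive across the whole range of estimates you find plausible, attend; if it flips sign with small changes to p_harm or value_good, that sensitivity is itself an argument for waiting, per point 3.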
I just wanted to say this (esp. the second part) is actually one of the most cogent posts about anything that I’ve read in quite some time, and as such, a self-referential example of the value of the camp. It should probably be more visible, and I recommend making it a discussion post about deciding whether/when to attend.
Nitpick—cRYonics. Thanks!
Doh, I have no idea why my hands type c-y-r instead of c-r-y, thanks.
You’re not alone—it’s a common mistyping!
Rather off-topic, but I’m very interested in rational meditation advice: did they suggest specific meditation techniques (e.g. vipassana), or did they recommend particular books on meditation?
Jasen Murray suggested specific techniques and specific resources, which I unfortunately cannot remember (I was not that interested in that part of RBC).
Thanks for that. It’s fascinating to get a glimpse of what rationality looks like in the real world rather than just in online interchanges.
As an aside, I’m a big fan of your work. It reassures me to know rationalists are on the team for DOTA 2.