This does sound like a good idea, and it’s a very generous invitation. Unfortunately, I am neither a woman, nor will I be traveling to Berkeley anytime soon. I do have one concern though.
The sorts of things that I am likely to find compelling are:
Caring about progress for humanity in the areas of living longer, healthier, and happier
Caring about FAI
Being agenty
Having interests in the sort of geeky things that a lot of the community is interested in (math/sciences/psychology)
The second item on this list strikes me as perhaps not very well thought out. In the last LW survey, only 16.5% of respondents identified uFAI as the most likely existential risk, well below two other responses, despite the fact that it is the only one that gets a significant amount of discussion here. In fact, I don’t think I’ve seen any discussion of whether, if uFAI is not the most likely existential risk, FAI is still the risk that rationalists should be focusing on primarily. So part of the point I want to make is that I’m concerned with how much we have equated, in our minds, the concepts of “rationalist” and “supporter of the Singularity Institute”.
The other, more specific point I wanted to make is that it seems like a bad idea to try to recruit members for a group based around one topic (learning to think better) exclusively from a sample of people who are interested in a different topic (FAI). I would predict that there are a large number of female aspiring rationalists who don’t really get the concept of CEV, or aren’t all that interested in it. If you filter for that, I think you’re going to miss out on an awful lot of people whom we would like to become part of the community.
Thanks for the feedback and the opportunity to clarify. That list was meant as examples of the sorts of things I would find intriguing or be interested in, not as requirements or filters. Any sort of wanting to improve the world in a way that creates value is of interest to me. And I’m not even asking for success, just some amount of desire. Just having it as part of their mental model, and keeping an eye out for opportunities, would be something I would value.