I found reading the post about the 2018 budget surprising on two fronts.
1) I wouldn’t have thought that CFAR would buy a venue given what I know about CFAR, but I think it’s a very good decision, especially given the presented economics. In addition to what’s already written, having your own venue means that when it isn’t being used for official CFAR purposes it can be rented cheaply to other people running events for rationalists. Especially in California, where real estate is expensive, I would expect that capability to be very valuable for the broader movement.
2) The decision to run fewer mainline workshops feels strange to me. I would expect it to be good to sell as many mainline workshops as possible while there are participants willing to pay $4,000 apiece.
When I try to put my intuitions into words, I think the biggest reason is that I believe scaling up general rationality education is very valuable, even if the impact per participant is higher for AI safety workshops.
A general expectation for startups is that most of the value they create comes not next year but years down the line. The road of running workshops for money seems to scale much better than a revenue plan that depends on donations.
In the document about your impact measurement, you don’t list under limitations the fact that self-reports are notoriously unreliable. I think there’s a good chance you put too much stock in those numbers.
> The road of running workshops for money seems to scale much better than the revenue plan that depends on donations.
I used to think this. What I think now is that being forced to do things that are profit-driven would inevitably warp CFAR into doing things that are less valuable, in sort of the way the publish-or-perish thing warps scientists’ incentives. Similarly, if CFAR focused on “self-sustain via workshops”, they would face all the pressures of the self-help industry, which pushes towards marketing and towards finding rich clients (e.g. corporate seminars). It pushes them towards finding people with money rather than the people who are most promising to teach rationality to.
I do think most of CFAR’s value is in… well, medium-term payoffs (my AI timelines are short, which changes how long I think it’s reasonable to wait for payoffs to pay out). But rather than the value coming from scaling up and becoming sustainable (which could easily become a lost purpose), I think the value comes from actually figuring out how to teach the most important rationality techniques and frameworks to the people who need them most.
This is easier to do if they are financially independent.
I thought that the mission of CFAR was to teach the skills to as many people as possible and not only to a select number of people.
If you have AGI timelines of 10–20 years I could understand moving all available resources to AGI, but are you really operating under such short timelines? If so, I would like to see those timelines presented in written form on LW. If they drive the strategy, I see no reason not to have the arguments out in the open.
As far as financial independence goes, needing to raise money from donations is likely to limit the amount of money that can be raised. Full independence comes from having a lot of money.