The road of running workshops for money seems to scale much better than the revenue plan that depends on donations.
I used to think this. What I think now is that being forced to do things that are profit-driven would inevitably warp CFAR into doing things that are less valuable. In sort of the way the publish-or-perish dynamic warps scientists’ incentives. Similarly, if CFAR focused on “self-sustain via workshops”, then they would face all the pressures of the self-help industry, which pushes towards marketing, towards finding rich clients (e.g. corporate seminars). It pushes them towards finding people with money rather than the people who are the most promising to teach rationality to.
I do think most of CFAR’s value is in… well, medium-term payoffs (my AI timelines are short, which changes how long I think it’s reasonable for payoffs to take to pay out). But rather than the value coming from scaling up and becoming sustainable (which could easily become a lost purpose), I think the value comes from actually figuring out how to teach the most important rationality techniques and frameworks to the people who need them most.
This is easier to do if they are financially independent.
I thought that the mission of CFAR was to teach the skills to as many people as possible and not only to a select number of people.
If you have AGI timelines of 10-20 years, I could understand moving all available resources to AGI, but are you really operating under such short timelines? If that’s the case, I would like to see more presentation of those timelines in written form on LW. If those timelines drive the strategy, I see no reason not to have the arguments out in the open.
As far as financial independence goes, needing to raise money from donations is likely going to limit the amount of money that’s raised. Full independence comes from having a lot of money.