Long run I’d prefer something like altruistic equity / certificates of impact. But frankly I don’t think our funding coordination problems are hard enough that it’s going to be worth figuring that kind of thing out.
(And like every other community we are free-riders—I think that most of the value of experimenting with such systems would accrue to other people who could copy them if they worked, and we are just too focused on helping with AI alignment to contribute to that kind of altruistic public good. If only someone would be willing to purchase the impact certificate from us if it worked out...)