Thank you, Eliezer. I’m fascinated by the reasoning and analysis that you’re hinting at here. It helps put the decisions you and SIAI have made in perspective.
Could you give a ballpark estimate of how much of the importance of successful rationality spin-offs rests on the expectation of producing core FAI researchers versus producing FAI funding?
I’ve tried less hard to get core FAI researchers than funding. I suspect that given sufficient funding produced by magic, it would be possible to solve the core-FAI-researchers issue by finding the people and talking to them directly—but I haven’t tried it!
How much money would you need magicked into existence to let you shed fundraising, infrastructure, etc., and just hire and hole up with a dream team of hyper-competent maths wonks? Restated: at what amount would SIAI be comfortably able to pursue its long-term research aggressively?
He once mentioned a figure of US $10 million/year. I feel like he’s made a similar remark more recently, but it didn’t turn up in my brief search.