That honestly makes me calm down a bit, and I’m sorry I overreacted. Nothing about that is in the blogpost, grant, or title, though. The opening line is Tyler Cowen announcing “...initiation of a new, special tranche of the Emergent Ventures fund to identify and foster artificial intelligence researchers...”, which sounds like the exact opposite of what we’d want. I certainly support people applying for AI safety research funding if they can get it, especially from an open-ended announcement. I hope that’s what people end up doing, but you can understand my concern.
Schmidt+Tyler are going to give out grants whether I crosspost it to /r/EA & LW or not. He doesn’t restrict it to safety like he should (he’s more convinced of AI risk these days than he used to be, but still not nearly as much as us), but he doesn’t rule safety out either, or even encourage capability work over safety. So, if EA/LWers apply for safety grants, that diverts money away from capability grants and toward safety grants. This seemed sufficiently self-evident to me to not need explanation.
You’re right. I apologize.