I don’t have a specific decision that I want OpenAI to make right now (well I do, but I don’t think they’d become closedAI).
Does “closedAI” mean:
- “OpenAI shuts down”, or
- “OpenAI stops making their models available to the public” (I’m not sure how they could do this one while earning money), or
- “OpenAI stops publishing papers describing their model architectures, training datasets, hyperparameters, etc”, or
- something else?
For the audience: one of the first “successes” of convincing a high-profile person of the importance of AI X-risk was Elon Musk.
Seems cherrypicked. Does Dustin Moskovitz provide a data point in the other direction? I don’t know the stories of how these people started taking AI risk seriously and would like more details. Also, Elon became “convinced” (although it doesn’t seem like he’s convinced in the same way Alignment Forum users are) as early as 2014, and the evidence for AI x-risk looks a lot different today in 2023.
I am more excited about interventions that focus on AGI labs. They are already barreling towards AGI, and it seems like them slowing down or coordinating with each other could be really useful.
Seems plausible to me that within the next few years some other AI companies could overtake OpenAI + DeepMind in the race to AGI. What reasons are there to expect current leaders to maintain their lead?