I agree about the strong commercial incentives, but I don’t think we will be in a context where people follow their incentives. After all, there are incredibly strong incentives not to make AGI at all until you can be very confident it is perfectly safe—strong enough that it’s probably not a good idea to pursue AI research at all until AI safety research is much better established than it is today—and yet here we are.
Basically, people won’t recognize their incentives, because they won’t realize how much danger they are in.
Hmm, in my model most of the x-risk is gone if there is no incentive to deploy. But I expect actors will deploy systems because those systems are aligned with a proxy; at least that leads to short-term gains. Maybe the crux is that you expect these actors to suffer a large private harm (death), while I expect a small private harm (for each system, a marginal distributed harm to all of society)?
It makes no difference if the marginal distributed harm to all of society is so overwhelmingly large that your share of it is still death.
I’m using the colloquial meaning of ‘marginal’ = ‘not large’.