I think you make a lot of great points.
I think some sort of cap is one of the highest-impact things we can do from a safety perspective. I agree that imposing the cap effectively and getting buy-in from broader society would be a challenge; however, those problems seem a lot more tractable than AI safety itself.
I haven’t heard anybody else propose this, so I wanted to float it out there.