I am also not impressed with the Pause AI movement, though I am concerned about AI safety. To me, focusing on AI companies and training FLOPs is not the best approach. Caps on data center sizes and worldwide GPU production caps would make more sense to me. Pausing software but not hardware buys more time for alignment but creates a worse hardware overhang, which I don't think is helpful. They also focus too much on OpenAI from what I've seen; xAI will soon have the largest training center, for a start.
I don't think this proposal is right or workable: https://pauseai.info/proposal. Figure out how biological intelligence learns, and you don't need a large training run. There is no guarantee at all that a pause at this stage can help align superintelligent AI; I think we need greater capabilities to know what we are dealing with. Even with a 50-year pause to study GPT-4-type models, I wouldn't be confident we could learn enough from them. They have no realistic way to lift the pause, so it amounts to a desire to stop AI indefinitely.
“There will come a point where potentially superintelligent AI models can be trained for a few thousand dollars or less, perhaps even on consumer hardware. We need to be prepared for this.”
You can't prepare for this without first having superintelligent models running on the most capable facilities, and without having already gone through a positive Singularity. They have no workable plan for achieving a positive Singularity; the plan is just to stop and hope.