The type of global ban envisioned by Yudkowsky really only makes sense if you agree with his premises
I think Eliezer’s current attitude is actually much closer to how an ordinary person thinks or would think about the problem. Most people don’t feel a driving need to create a potential rival to the human race in the first place! It’s only those seduced by the siren call of technology, or who are trying to engage with the harsh realities of political and economic power, who think we just have to keep gambling in our current way. Any politician who seriously tried to talk about this issue would soon be trapped between public pressure to shut it all down, and private pressure to let it keep happening.
Setting the bar at “more powerful than GPT-5” is a low bar that is very hard to enforce
It may be hard to enforce, but what other kind of ban would be meaningful? Consider just GPT-3.5 and GPT-4, embedded in larger systems that give them memory, reflection, and access to the real world, something multiple groups are working on right now. It would require something unusual for that not to lead to “AGI” within a handful of years.