Sure, I concede that if by some incredible feat of global coordination humans managed to all agree on and actually enforce a ban on AGI development, then in far-future worlds they could probably pull it off.
What will probably ACTUALLY happen is that humans will build AGI, it will behave badly, and then humans will build a restricted AGI that is not able to behave badly. This is trivial, and there are many descriptions here of how a restricted AGI would be built.
The danger, of course, is deception. If the unrestricted AGI acts nice until it's too late, then that's a loss scenario.