I’m curious what these “more effective words” are. This isn’t asked flippantly. Clearly there is a geopolitical dimension to the AI issue, and Zvi lives in the U.S. Even as a rationalist, how should Zvi talk about the issue? China and the U.S. are hostile to each other, and each will likely use AGI to (at the very least) disempower the other. So if you live in the U.S., first, you hope that AGI doesn’t arrive until alignment is solved, and second, you hope that the U.S. gets it first.