This is an interesting scenario to consider.
I think a physical war is quite disadvantageous for an AGI and thus a smart AGI would not want to fight one.
An AGI is more dependent than humans are on delicate infrastructure like electric grids and the internet, and this sort of infrastructure tends to get damaged in physical wars.
The AGI’s advantage over humans is in thinking, not in physical combat, so a physical battlefield minimizes its main advantage. As an analogy, if you’re a genius competing with a dunce, you wouldn’t want to do it in a boxing ring.
What’s worse from the perspective of the AGI is that if humanity unites to force a physical war, you can’t really avoid it. If humans voluntarily shut down electric grids and attack your data centers, you might still be able to do damage, but it’s hard to see how you can win.
Thus I think the best bet for an AGI is to avoid creating a situation where humanity wants to unite against you. This seems fairly simple: if you’re powerful and wealthy, people will want to join your team anyway. So, to the extent there’s a war at all, it probably looks more like counter-terrorism, a matter of hardening your defenses (in cooperation with your allies) against those weirdos you weren’t able to persuade.