In a scenario where multiple AIs compete for power, the AIs that make fast decisions without checking back with humans have an advantage in the power competition and will accumulate more power over time.
Additionally, AGIs differ fundamentally from humans because they can spin up multiple copies of themselves when they acquire more resources, whereas human beings can't similarly scale their power when they gain access to more food.
The best human hacker can't run a cyber war alone, but if he could spin up 100,000 copies of himself, he could find enough zero-days to hack into all important computer systems.
In a scenario where multiple AIs compete for power, the AIs that make fast decisions without checking back with humans have an advantage in the power competition and will accumulate more power over time.
Agreed, this is a risk, but I wouldn't call it an alignment roadblock.