O(“AGI Safety”) > O(“Stop Tyrants”)
~ AGI safety is at LEAST as hard as designing a protocol that prevents tyranny ~
When we want to keep ourselves safe from AGI, one of the key criteria is: “can we turn it off if it goes berserk?” That is the same requirement we face whenever we put a person in charge of an entire government: “can we depose this person, without them using the military to stop us?”
AGI poses extra risks and problems, being super-intelligent, while most dictators are morons. Yet, if someone told me “we solved AGI safety!” I would happily announce “then you’ve also solved protecting-governments-from-dictators!” A dictator is just a weaker instance of the same problem: an agent we empowered and can no longer depose. You might be able to prevent all dictators in a way which does NOT cure us of AGI risk… though, if you reliably prevent the AGI-pocalypse, then you’ve definitely handled dictators, using the same method.
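To make the title’s inequality slightly more precise, here is a minimal formal sketch. Everything in it is my own ad-hoc notation, not an established formalism: Shutdownable, cap, and the capability thresholds are labels I invented for this post. The point is just that a protocol which contains any agent up to AGI-level capability automatically contains any agent of merely human capability.

```latex
% Ad-hoc notation: Shutdownable(P, a) means "under protocol P, we can
% depose / turn off agent a against a's will"; cap(a) is a's capability.
\[
\text{AGIsafe}(P) \;:=\; \forall a.\ \mathrm{cap}(a) \le c_{\mathrm{AGI}} \Rightarrow \text{Shutdownable}(P, a)
\]
\[
\text{TyrantSafe}(P) \;:=\; \forall a.\ \mathrm{cap}(a) \le c_{\mathrm{human}} \Rightarrow \text{Shutdownable}(P, a)
\]
% Since c_human < c_AGI, the first definition quantifies over a superset
% of agents, so:
\[
\text{AGIsafe}(P) \Rightarrow \text{TyrantSafe}(P),
\qquad\text{and by contraposition}\qquad
\lnot\,\text{TyrantSafe}(P) \Rightarrow \lnot\,\text{AGIsafe}(P)
\]
```

The contrapositive is what makes the next paragraph gloomy: millennia of failed tyrant-proofing are evidence against anyone currently holding an AGI-proof protocol.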
So, does that mean we’ll soon find a panacea for AGI-death threats? Considering that we haven’t stopped dictators in the last few… centuries? Millennia? Yeah, we might be screwed. Especially considering that dictators have nukes, and can engineer super-viruses… Oh, and that would imply: “Dictators ARE the existential risk of ‘a berserk machine we can’t turn off’… meaning that we need to fight those AGI overlords today.”