Could we join the race to create dangerous AGI in a way that attempts to limit the damage it can cause, while still allowing it to cause enough damage to move other pivotal acts into the Overton window?
If the first AGI created is designed to give the world a second chance, the damage it causes may convince the world that a second chance should never be needed again. Obviously this could fail and simply end the world earlier, but it would certainly make for a convincing argument.
In the early days of the pandemic, even though all the evidence was available, virtually no one cared about COVID-19 until it was knocking on their door; then, suddenly, pandemic preparedness seemed like the most obvious thing in the world to everyone.