My current feeling is that the opposite, long timelines and hard takeoff, has the best chance of going well.
The main advantage of short timelines is that it makes an immediately-fatal hard takeoff less likely, as there is presumably less overhang now than in the future. It perhaps also reduces the number of players, as presumably it’s easier to join the game as tech improves, so there may never be fewer players than there are now. It also has the advantage of maybe saving the lives of those too old to make it to a longer timeline. However, I think the overhang is already dangerously large, and probably was years ago, so I doubt this is helping much.
The main advantage of a soft takeoff is that we might be able to get feedback and steer it as the takeoff happens, perhaps reducing the risk of a critical error. It also increases the chances of a multipolar scenario, where there is an economy of competing AIs. If we don’t like some of the gods we build, perhaps others will be more friendly, or will at least be able to stalemate the bad ones before they kill everyone.
However, I think a multipolar scenario (while unlikely to last even in a soft takeoff) is very dangerous. I don’t think the long-term incentives are favorable to human survival, for two reasons. First is Bostrom’s Black Marble scenario (he’s also called them “black balls”, but that already means something else): every new technology discovered has a chance of destroying us, especially if we lack the coordination to abstain from using it. In a multipolar world, we lack that coordination. Hostile AIs may recklessly pursue dangerous research or threaten doomsday to blackmail the world into getting what they want, and it is game-theoretically advantageous for them to make the threat in such a way that they provably cannot back down and spare the world if we call the bluff (the equivalent of defiantly ripping off the steering wheel in a game of chicken). Second, we’ll eventually fall into Malthusian/Molochian traps if we’re unable to coordinate to avoid them. AIs willing to burn everything else for their simple goals will simply outcompete those with more to protect.
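The chicken-game logic can be made concrete with a toy payoff matrix (the numbers here are illustrative assumptions, not anything canonical): a visible, irrevocable commitment to going straight forces a rational opponent to swerve, even though mutual straight is catastrophic for both.

```python
# Toy payoff matrix for the game of chicken. The payoff numbers are
# illustrative assumptions chosen to show the structure of the game.
# payoffs[(row_move, col_move)] = (row_payoff, col_payoff)
payoffs = {
    ("swerve", "swerve"): (0, 0),
    ("swerve", "straight"): (-1, 1),
    ("straight", "swerve"): (1, -1),
    ("straight", "straight"): (-10, -10),  # mutual destruction
}

def best_response(row_move):
    """The column player's payoff-maximizing reply to a fixed row move."""
    return max(("swerve", "straight"),
               key=lambda m: payoffs[(row_move, m)][1])

# If the row player visibly commits to "straight" (rips off the steering
# wheel), the column player's best response is to swerve, and the
# committed player wins the confrontation. But if the bluff is called
# anyway, both sides take the catastrophic (-10, -10) outcome.
print(best_response("straight"))                       # swerve
print(payoffs[("straight", best_response("straight"))])  # (1, -1)
```

The point is that the commitment only works because it is provable and irrevocable: an agent that could still change its mind would invite the opponent to call the bluff.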
Longer timelines give us more time to work on alignment, and on improving our capabilities to do so. A new generation could be educated. Brain-computer interfaces may be developed enough to enhance human intelligence. Human genetic engineering or iterated embryo selection might also increase the number of humans able to do the necessary research. We can still try cryonics (or cheaper chemical fixation) for the old people, although that can’t realistically work for everyone: not everyone can afford it, not everyone believes it has a chance of working (or they already believe in a false afterlife), and accidents will destroy brains before they can be preserved.
And finally, a hard takeoff gives us a singleton, with high likelihood. It likely avoids the s-risks that are more probable in multipolar scenarios (even if it simply kills everyone), and (if we survive) lets us finally kill Moloch and gives us the best possible coordination power to survive Black Marbles and eventual alien encounters.