For example, I've also talked with folks who've thought a lot about safety and who honestly think that existential risk is lower if we have AI soon (before humanity can harm itself in other ways).
It seems hard to make the numbers come out that way. E.g. suppose human-level AGI in 2030 would cause a 60% chance of existential disaster and a 40% chance of existential disaster becoming impossible, and human-level AGI in 2050 would cause a 50% chance of each. Then, in the world where human-level AGI comes in 2050, you'd have to expect a 1/5 probability of existential disaster from other causes over 2030-2050 to be indifferent about AI timelines. (That way, with human-level AGI in 2050, you'd have a 4/5 × 1/2 = 40% chance of surviving, just as with human-level AGI in 2030.) I don't really know of non-AI risks in the ballpark of 10% per decade.
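To make the bookkeeping explicit, here's a minimal sketch of that indifference calculation (my own illustrative Python; the function and variable names are made up, not anything from the original discussion):

```python
def required_interim_risk(p_disaster_early, p_disaster_late):
    """Probability of existential disaster from other causes in the
    interim period (here 2030-2050) that would make you indifferent
    between early and late human-level AGI."""
    survive_early = 1 - p_disaster_early  # survive if AGI comes early
    survive_late = 1 - p_disaster_late    # survive the AGI itself, if late
    # Indifference: survive_early == (1 - q) * survive_late; solve for q.
    return 1 - survive_early / survive_late

# 60% disaster from AGI in 2030 vs. 50% from AGI in 2050:
print(required_interim_risk(0.60, 0.50))  # ≈ 0.2, i.e. 1/5 over two decades
```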
(My guess at MIRI people's model is more like a 99% chance of existential disaster from human-level AGI in 2030 and a 90% chance in 2050, in which case indifference would require a 90% chance of some other existential disaster in 2030-2050, to cut the 10% chance of survival down to 1%.)
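The same check with those guessed MIRI-style numbers (again just my illustrative arithmetic, self-contained):

```python
survive_2030 = 1 - 0.99  # 1% survival if human-level AGI comes in 2030
survive_2050 = 1 - 0.90  # 10% survival from the AGI itself, if it comes in 2050
q = 1 - survive_2030 / survive_2050
print(q)  # ≈ 0.9: a 90% chance of some other disaster in 2030-2050
```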