Would most existing people accept a gamble with a 20% chance of death in the next 5 years and an 80% chance of life extension and radically better technology? I concede that many would, but I think it’s far from universal, and I wouldn’t be too surprised if half of people or more think this isn’t for them.
I personally wouldn’t want to take that gamble (strangely enough, I’ve been quite happy lately and my life has been feeling meaningful, so the idea of dying in the next 5 years sucks).
(Also, I want to flag that I strongly disagree with your optimism.)
For what it’s worth, while my credence in human extinction from AI in the 21st century is 10-20%, I think the chance of human extinction in the next 5 years is much lower. I’d put that at around 1%. The main way I think AI could cause human extinction is by just generally accelerating technology and making the world a scarier and more dangerous place to live. I don’t really buy the model in which an AI will soon foom until it becomes a ~god.
I like this framing. I think the more common statement would be a 20% chance of death in 10-30 years, and an 80% chance of life extension and much better technology that they might not live to see.
I think the majority of humanity would actually take this bet. They are not utilitarians or longtermists.
So if the wager is framed in this way, we’re going full steam ahead.