Personally, I want to get to the glorious transhumanist future as much as anybody, and as soon as possible, but if there’s a chance that AI kills us all instead, that’s reason enough for me to say we should be hitting pause on it.
I don’t wanna pull the meme phrase on people here, but if it’s ever going to be said, now’s the time: “Won’t somebody please think of the children?”
Any chance? A one in a million chance? 1e-12? At some point you should take the chance. What is your Faust parameter?
It depends on how fast the chance can be decreased. If it takes 50 years to shrink it from 1% to 0.1%, then given all the people who would die in that time, I’d probably be willing to risk it.
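To make that concrete, here’s the back-of-the-envelope version of that tradeoff. Every number is a rough assumption of mine, not an established figure: ~8B people alive, ~60M deaths per year worldwide, and the charitable premise that the post-AI future would prevent those ordinary deaths.

    # Rough expected-deaths comparison for the pause tradeoff above.
    # All inputs are illustrative assumptions, not established figures.
    WORLD_POP = 8e9          # ~8 billion people alive today (rough)
    DEATHS_PER_YEAR = 60e6   # ~60 million deaths per year worldwide (rough)

    def expected_deaths(p_doom, pause_years):
        # Doom risk applied to everyone, plus everyone who dies of
        # ordinary causes while we wait (assuming the post-AI future
        # would otherwise have prevented those deaths).
        return p_doom * WORLD_POP + pause_years * DEATHS_PER_YEAR

    print(expected_deaths(0.01, 0))    # ship now at 1% doom:      8.0e7
    print(expected_deaths(0.001, 50))  # pause 50y to get 0.1%:    ~3.0e9

Under those assumptions, the 50-year pause costs roughly 40x more lives in expectation, which is why the rate at which the risk can actually be reduced is the whole question.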
As of right now, even the most optimistic experts I’ve seen put p(doom) at much higher than 1% - far into the range where I vote to hit pause.