I think the substance of my views can be mostly summarized as:
AI takeover is a real thing that could happen, not an exotic or implausible scenario.
By the time we build powerful AI, the world will likely be moving fast enough that a lot of stuff will happen within the next 10 years.
I think that the world is reasonably robust against extinction but not against takeover or other failures (for which there is no outer feedback loop keeping things on the rails).
I don’t think my credences add very much except as a way of quantifying that basic stance. I largely made this post to avoid confusion after quoting a few numbers on a podcast and seeing some people misinterpret them.
Yepp, agree with all that.