I agree that “taking down this fence is going to cause society to collapse” is almost always false, at least when there is any real danger of the fence being taken down.
The same thing likely applies to statements like “programming an AGI without a tremendous amount of care about its exact goals is going to destroy the world.”
I’d argue we have rather more experience of taking down fences that people cling to than of programming AGI goals...