We have to cope with humanity as it is, at least until there’s an AI overlord. Ignoring the other counterarguments for a moment, does voluntarily stopping economic expansion and halting technological progress sound like something humanity is capable of doing?
I agree, humanity is probably not capable of stopping economic growth. And personally I’m glad of that, because I would choose UFAI over humanity in its current form (partly because in this form humanity is likely to destroy itself before long).
The point I was making is that if you value UFAI as negatively as EY does, then avoiding UFAI may be worth giving up AI entirely.