We shouldn’t need to fully specify what we actually want if we’re building a specialized machine to, say, cure world hunger or design better integrated circuits.
What if we’re building a specialized machine to prevent a superintelligence from annihilating us?