People have complex sets of goals, tendencies, and instincts; no entity brought into existence so far has been a utility maximizer.
That makes us dangerous if we become too powerful, but it does not make us useless when our power is checked.
We might well not want an AI to be an explicit utility maximizer. Oddly enough, starting from that design might not generate the most utility.