Because otherwise, you might self-modify into an agent that's worse at achieving convergent instrumental goals than you are now, or one whose terminal goals are harder to achieve. Wouldn't that suck? Be artificial, but do so carefully.