I am not talking about saving money; I am talking about competent engineering. By “authority” I mean the AI can take an action that has consequences, anything from steering a bus to approving expenses.
To engineer an automated system with authority, you need some level of confidence that it is not going to fail, or, in the case of AI systems, collude with other AI systems and betray you.
This betrayal risk means you probably will not put “goal complete” AI systems in any position of authority without some kind of mitigation for that risk.
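What that mitigation looks like is an open question, but the simplest version is a gate: the AI can propose consequential actions, and something outside the AI decides whether they run. Here is a minimal sketch of that idea in Python; the names (`ProposedAction`, `execute_with_gate`, `consequence_level`) are my own illustrative assumptions, not any real framework.

```python
# Hypothetical sketch: withhold authority from an AI system unless the
# action is low-consequence or an outside party signs off on it.
# All names here are illustrative assumptions, not a real library.
from dataclasses import dataclass
from typing import Callable

@dataclass
class ProposedAction:
    description: str          # e.g. "approve expense report"
    consequence_level: int    # rough measure of how much damage it could do

def execute_with_gate(action: ProposedAction,
                      approver: Callable[[ProposedAction], bool],
                      max_autonomous_level: int = 1) -> bool:
    """Run the action only if it is low-consequence or explicitly approved."""
    if action.consequence_level <= max_autonomous_level:
        print(f"Executing autonomously: {action.description}")
        return True
    if approver(action):
        print(f"Executing with sign-off: {action.description}")
        return True
    print(f"Blocked: {action.description}")
    return False

# Usage: an expense approval is gated; a trivial action is not.
execute_with_gate(
    ProposedAction("approve expense report", consequence_level=3),
    approver=lambda a: False,  # stand-in for a real human review step
)
execute_with_gate(
    ProposedAction("log a status message", consequence_level=0),
    approver=lambda a: False,
)
```

The design choice that matters here is that the approval path lives outside the AI system itself, so collusion between AI systems cannot waive it.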