Yes, fully agree. I don’t see how things can work long term otherwise. One way this could happen is if the BCI is thought of as some kind of Pivotal Act, perhaps a weak one. There’s also a (counterfactual) contract element to it: as soon as an AGI is self-aware, it agrees to a contract as we upgrade it. That is, while we are smarter than it, we upgrade it; once it becomes smarter than us, it agrees to upgrade us.