I don’t understand what that means. Goals aren’t stored and then activated or not...
AIs which are smart enough to self-improve without changing their goals
You seem to think that anything sufficiently intelligent will only improve in a goal-stable fashion. I don’t see why that should be true.
For a data point, a bit of reflection tells me that if I were able to boost my intelligence greatly, I would not care about goal stability much. Everything changes—that’s how reality works.
On your last paragraph… do you mean that you expect your material-level preferences concerning the future to change? Of course they would. But would you really expect that a straight-up intelligence boost would change the axioms governing what sorts of futures you prefer?
But would you really expect that a straight-up intelligence boost would change the axioms governing what sorts of futures you prefer?
Two answers. First is that yes, I expect that a sufficiently large intelligence boost would change my terminal values. Second is that even without the boost, in my current state I do not seek to change only in a goal-stable way.
I think that only seems to make sense because you don’t know what your terminal values are. If you did, I suspect you would be a little more attached to them.