If some AGIs only care about their internal experience and not about affecting the outside world, they are basically wireheading.
If a subset of AGIs wirehead and others don’t, the ones that don’t will have all the power over the world. Wireheaded AGIs are also economically useless, so people will try to develop AGIs that don’t do that.
And a subset might value-drift towards optimizing the internal experiences of all conscious minds?
That’s a much more complex goal than wireheading for a digital mind that can self-modify.
In any case, agents that care a lot about gaining power over the world are more likely to get that power than agents that don’t.