When asking this question, do you include scenarios where humanity really doesn’t want control, is impressed by the irreproachability of the GPTs, and does its best to hand over control to them as fast as possible, even as the GPTs struggle and only “try” in the sense that they accept whatever tasks are handed to them? Or do the GPTs have to in some way actively attempt to wrest control from humans or trick them?
Yes, that’s what I meant by “takeover.” It’s distinct from ceding control voluntarily.
I do not see humanity ever fully ceding control, as distinct from accepting a lot of help and advice from AIs. Why cede control if you can get all of the upsides without losing the ability to change your mind?
Of course, if you’re accepting all of the advice, you have temporarily ceded control.
But I’m primarily concerned with humanity accidentally losing its choice through the creation of AGI that’s not aligned with the majority of humanity’s interests or desires—the classic alignment question.
What if humanity mistakenly thinks that ceding control voluntarily is temporary, when actually it is permanent because it makes the systems of power less and less adapted to human means of interaction?