Maybe my theory of goodness is so wrong that the importance of consultation and choice will turn out to be hilariously false, but… I’m pretty sure it’s not.
To try to steelman the other side: people don’t ask for consultation on things they are very, very certain will be viewed as positive. It’s not immoral to skip consulting you before I give you a billion dollars. Similarly, a future AI might have a model of humanity so good that it can predict our choices with immense accuracy, in which case actually consulting humanity would just be needlessly wasting precious negentropy while the humans spend months arriving at the choice the AI already knows they will pick.