And I feel like anyone who really has internalized the idea that minds are allowed to fundamentally care about completely different things...
What is questionable is not the possibility of fundamentally different values but that they could be implemented accidentally. What you are suggesting is that some intelligence could evolve a vast repertoire of heuristics, acquire vast amounts of knowledge about the universe, and dramatically improve its cognitive flexibility, yet never evolve its values and instead keep its volition at the level of a washing machine. I think this idea is flawed, or at least not sufficiently backed up to take seriously right now. I believe that such an incentive, or any incentive, would have to be deliberately and carefully hardcoded or evolved. Otherwise we are merely talking about grey goo scenarios.