I think that past investigators didn’t have good guesses of what the mechanisms are. Most reasoning about human values seems to be of the sort “look at how contextually dependent these ‘values’ things are, and the biases are also a huge mess, I doubt there are simple generators of these preferences”, or “Evolution caused human values in an unpredictable way, and that doesn’t help us figure out alignment.”
E.g. the fact that different humans have relatively similar levels of power to each other seems important; we aren’t very aligned with agents much less powerful than us, like animals, and I wouldn’t expect a human who had been given all the power in the world all their life, such that they’ve learned they can solve any conflict by destroying their opposition, to be very aligned.
This reasoning is not about mechanisms. It is analogical. You might still believe the reasoning, and I think it’s at least an a priori relevant observation, but let’s call a spade a spade. This is analogical reasoning to AGI: drawing on select observations (some humans don’t care about less powerful entities) and inferring that AGI will behave similarly.
(Edited this comment to reduce unintended sharpness)