It seems to me that all the more basic desires (“values”), e.g. the lower layers of Maslow’s hierarchy of needs, are mainly determined by heritable factors, since they are relatively stable across cultures. So presumably you are talking about “higher” values being a function of “data sources in the world”, i.e. of nurture rather than nature?
I agree there probably are some heritable values, though my big difference here is that I think the set of primitive values is quite a bit smaller than you might expect.
Though be warned, heritability doesn’t actually answer our question, because the way laypeople usually interpret it is pretty wrong:
https://www.lesswrong.com/posts/YpsGjsfT93aCkRHPh/what-does-knowing-the-heritability-of-a-trait-tell-me-in
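To gesture at the standard misreading (my own gloss, not something from the linked post): narrow-sense heritability is a within-population variance ratio,

h^2 = \frac{\operatorname{Var}(A)}{\operatorname{Var}(P)},

i.e. the share of phenotypic variance Var(P) attributable to additive genetic variance Var(A) in a particular population and environment. By itself it says nothing about how fixed a trait is for any individual, or how the trait would respond to interventions that population never experienced.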
I probably should have drawn a sharper distinction between the formal ethical theories people describe, which you call morals, and what their values actually are.
I was always referring to values when I was talking about morals.
You are correct that someone describing a moral theory doesn’t mean that they actually agree with or implement it.
I still think that if you had the amount of control over a human that an ML researcher has over an AI today, you could brainwash them into holding ~arbitrary values, and that degree of control would become the central technology of political and social life, which is a big deal.