Until I saw this discussion, I don't think I had ever (consciously) thought of values (outside the value = price economic sense) in the way this discussion frames them for me. Both the idea of values as units of choice (like goods) and the thought about the fragility of values (value systems, I think, was implied) put me on that line of thinking.
When we think about economic crises (boom-bust cycles, depression events, and various fluctuations in patterns of global trade), I wonder if the same is true for value systems. Both are built up from unit-level decisions. The units stand in various kinds of relationships to one another (tight vs. loose, complementary vs. substitute, near-term vs. far-term). When anything changes there are ripple effects: some stay isolated, some cascade.
Similarly, in the economic sphere, no one really chooses the overall pattern of the economy or the structure of production; it's largely an emergent outcome. Treating values as units, and asking how fragile the future is under any given set of values, seems very similar.
That would suggest that an AI with a different set of values (and a different prioritization or weighting of the values in that set) could have a large impact. But it also suggests that such an AI might not be able to drive the future toward what it wants over what humans want. That, perhaps, is hopeful.