To the extent that people now don’t care about the long-term future, there isn’t much to do in terms of long-term alignment. People today who care about what happens 2000 years from now probably have roughly similar preferences to people 1000 years from now, provided those future people aren’t significantly biologically changed or cognitively enhanced, because some component of what people care about is biological.
I’m not saying it would be random so much as not very dependent on the original history of the humans used to train early AGI iterations. It would have a different data history, but part of that is because of different measurements, e.g. scientific measuring tools. A different ontology means that value-laden things people might care about, like “having good relationships with other humans”, are not meaningful to future AIs in terms of their world model; they wouldn’t care much about those things by default (they aren’t even modeling the world in those terms), and it would be hard to encode a utility function that makes them care despite the ontological difference.