There are very “large” impacts to which we are completely indifferent (chaotic weather changes, the above-mentioned change in planetary orbits, the different people who are born as a consequence of who meets and dates whom across the world, etc.), and other, smaller impacts that we care about intensely (the survival of humanity, the preservation of people’s personal wealth, certain values and concepts carrying forward, key technological innovations being made or prevented, etc.).
I don’t think we are indifferent to these outcomes. We leave them to luck, but that’s a fact about our limited capabilities, not about our values. If we had enough control over “chaotic weather changes” to steer a hurricane away from a coastal city, we would very much care about doing so. So if a strong AI can reason through these impacts, it suddenly faces a harder task than a human does: “I’d like this apple to fall from the table, and I see that running the fan for a few minutes will achieve that goal, but doing so would subtly steer a hurricane, and we can’t have that.”
Yes, but we would be mostly indifferent to shifts in the distribution that preserve most of its features, e.g. if the weather were the same but delayed or advanced by six days.
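
To make that distinction concrete, here is a toy sketch (my own illustration, not from the discussion; all names and the synthetic weather model are hypothetical). A naive day-by-day impact measure flags a six-day phase shift as a huge change, while a measure over coarse feature distributions treats it as negligible, yet still flags a genuinely different climate such as uniform warming:

```python
import numpy as np

rng = np.random.default_rng(0)

def weather(days=365, shift=0):
    """Toy daily-temperature series: a fast seasonal-style cycle plus noise,
    optionally delayed/advanced by `shift` days. Purely illustrative."""
    t = np.arange(days) + shift
    return 15 + 10 * np.sin(2 * np.pi * t / 30) + rng.normal(0, 2, days)

def microstate_impact(a, b):
    """Naive day-by-day comparison: large for any chaotic divergence,
    even one we would not care about."""
    return np.abs(a - b).mean()

def feature_impact(a, b):
    """Compare coarse distributional features (mean, spread, extremes)
    rather than exact day-by-day microstates."""
    feats = lambda x: np.array([x.mean(), x.std(), x.max(), x.min()])
    return np.abs(feats(a) - feats(b)).sum()

base = weather()
shifted = weather(shift=6)   # same climate, six days out of phase
warmed = weather() + 5       # genuinely different distribution

print(microstate_impact(base, shifted))  # large: almost every day differs
print(feature_impact(base, shifted))     # small: features are preserved
print(feature_impact(base, warmed))      # large: the distribution really moved
```

The point of the sketch is only that an impact measure defined over feature distributions, rather than microstates, is one way to encode the kind of indifference described above.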