I think my immediate objection to Robin’s take can be summarized with a classic “shit rationalists say” quote: “You have no idea how BIG mindspace is.” Sure, we quibble over how much it’s okay to impose our values on our descendants, but those value differences are about (warning: tongue firmly in cheek) relatively trivial things like “what fictional deity to believe in” or “whether it’s okay to kill our enemies, and who they are”. Granted, Robin talks about acceleration of value change, but value differences like “whether to convert the Earth and its occupants into computronium” seem like a significant discontinuity with previous value differences among generations of humans.
Humans may have different values from their ancestors, but it is not typical for them to have both the capability and the desire (or lack of inhibition) to exterminate those ancestors. If it were, presumably “value totalitarianism” would be a lot more popular. Perhaps Robin doesn’t believe that AGI would have such a large value difference with humans; but in that case we’re back in the realm of a factual dispute, rather than a philosophical one.
(It does seem like a very Robin Hanson take to end by saying: Although my opponents advocate committing what I clearly suggest should be considered an atrocity against their descendants, “[e]ven so, I must admit that [it] deserves to be among the range of responses considered to future intergenerational conflicts.”)