It’s a bit ironic that this is written from the perspective of “alignment”, because a human inclination towards IGF (inclusive genetic fitness) is exactly the sort of loop that gets thrown when you try to align AGI with human moral intuition. A lot of our moral intuition is about what’s good for our children (IGF) and for our community via reciprocity (so IGF again). You’re never going to be able to align AGI with human moral intuition without factoring in that humans care a lot about IGF (ultimately, not proximately).
Personally, I think if you try to better align humans to maximise IGF, you’ll find a lot of morally repulsive exploits in the system. Probably the human in history who maximised IGF the most was Genghis Khan, and that sure involved a lot of rape and murder. (It’s probably not an ESS — an evolutionarily stable strategy — though.)