I absolutely agree that “wanting kids” is inner not outer, as is “not wanting kids” or “liking sex”. The question was how well they are aligned with the outer optimizer’s goals along the lines of “have your heritable traits survive for as long as possible”.
I somewhat agree with the original post that the inner goals are not as misaligned with the outer goals as they might superficially seem. Even inventing birth control so as to have more non-reproductive sex without having to take care of a lot of children can be more beneficial for the outer goal than not inventing or using birth control.
The biggest flaw with the evolution=outer, culture/thoughts=inner analogy in general, though, is that the time and scope scales of evolution's outer optimization are drastically larger than the timescales of any inner optimizers we might have. When we're considering AGI inner/outer misalignment, the two won't be anywhere near so different.