And since I don’t particularly want to discourage other people from having children, I decline to discuss my own reasons publicly (or in the vicinity of anyone else who wants kids).
That sounds sufficiently ominous that I’m not quite sure I want kids any more.
Shouldn’t you be taking into account that I don’t want to discourage other people from having kids?
That might just be because you eat babies.
Unfortunately, that seems to be a malleable argument. How your stating this (that you don't want to disclose your reasons for not wanting kids) influences an audience will depend heavily on their priors: how generally valid to any other person they expect the reason to be, and how self-motivated both the not-wanting-kids and the not-wanting-to-discourage-others might be.
Then again, I might be missing some key pieces of context. No offense intended, but I try to make it a point not to follow your actions or gobble up your words personally, even to the point of imagining a computer-generated mental voice when reading the Sequences. I've already been burned pretty hard by blindly reaching for a role model I was too fond of.
But you’re afraid that if you state your reason, it will discourage others from having kids.
All that means is that he is aware of the halo effect. People who have enjoyed or learned from his work will give his reasons undue weight as a consequence, even if they don’t actually apply to them.
Obviously his reason is that he wants to maximize his own time and resources for FAI research. Since not everyone is a seed AI programmer, this reason does not apply to most other people. If Eliezer thinks FAI will probably take a few decades (which the evidence seems to indicate he does), then it may very well be in the best interest of rationalists who aren't themselves FAI researchers to be having kids, so he wouldn't want to discourage that. (Although I don't see how just explaining this would discourage anyone you'd otherwise want to have kids from having them.)