You have more faith in your model of people’s motivation than I do in mine. But that doesn’t mean you’re right. There are tons of examples in literature and in daily life of mis-/re-directed biological drives, and making an AGI “child” seems so mundane a motive that I hadn’t considered, until your comment, that it might NOT be a strong enough motive.
I have to admit I’ve seen this as a strong motive for creating AGI, in both myself and others. Maybe it’s because I just don’t get along with other humans very well (or, more specifically, I fail to model them properly), or because I feel as if I would understand an AGI better than I understand them, but it just seems much more appealing to me than having an actual child, at least right now.
Specifically, my goal is (assuming I understand correctly) non-goal-directed bounded artificial intelligence agents, so… it’s pretty similar, at least. It’s certainly a strong enough motive for some people.