The majority of people choose to make non-goal-directed, uncontrolled natural-intelligence agents. It seems likely that as general AI becomes feasible, this drive to procreate will motivate at least some people to create such a thing.
It doesn’t seem likely to me. People don’t procreate in order to fulfil the abstract definition you gave; they procreate to fulfil biological urges and cultural mores.
You have more faith in your model of people’s motivation than I do in mine. But that doesn’t mean you’re right. There are tons of examples, in literature and in daily life, of misdirected or redirected biological drives, and making an AGI “child” seems so mundane a motive that I hadn’t considered until your comment that it might NOT be a strong enough motive.
I have to admit I’ve seen this as a strong motive for creating AGI, both in myself and in others. Maybe it’s because I just don’t get along with other humans very well (or, specifically, I fail to model them properly), or because I feel as if I would understand an AGI better than I understand them, but it just seems much more appealing to me than having an actual child, at least right now.
Specifically, my goal is (assuming I understand the terms correctly) to make non-goal-directed, bounded artificial-intelligence agents, so… it’s pretty similar, at least. It’s certainly a strong enough motive for some people.