Does ‘humanness’ require that your history involved growing up in a house with a yard and a dog?
So first, you have to understand that human definitions are fuzzy. That is, when you say “require,” you are not going about this the right way. For this I recommend Yvain’s post on disease.
As for the substance, I think “growing up” is pretty human. We tend to all follow a similar sort of development. And again, remember that definitions are fuzzy. Someone whose brain never develops after they’re born isn’t necessarily “not human”; they are just much farther from the human norm.
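To make the “farther from the human norm” framing concrete, here is a minimal sketch (the features and numbers are made up purely for illustration, not anyone’s actual proposal) that scores cases by distance from a human prototype rather than applying a binary “is it human?” test:

```python
# Toy sketch: "humanness" as graded distance from a prototype,
# not a yes/no requirement. All features and values are hypothetical.
from math import dist

# Hypothetical feature vector: (typical development, social upbringing, embodiment)
HUMAN_PROTOTYPE = (1.0, 1.0, 1.0)

def distance_from_human_norm(case: tuple[float, float, float]) -> float:
    """Euclidean distance from the prototype; larger = farther from the norm."""
    return dist(case, HUMAN_PROTOTYPE)

cases = {
    "typical person": (1.0, 1.0, 1.0),
    "isolated child": (1.0, 0.1, 1.0),     # raised without human contact
    "undeveloped brain": (0.0, 0.5, 1.0),  # brain never develops after birth
}

for name, case in cases.items():
    print(f"{name}: {distance_from_human_norm(case):.2f}")
```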
Differences in mental algorithms are largely irrelevant.
Insert Searle’s Chinese room argument here.
but crucially we actually want future AGIs to have human values!
SIAI doesn’t, at least. When making an AI to self-improve to superintelligence, why make it so that it gets horny?
would a Homo sapiens child who grows up in complete isolation from other humans and learns only through computers and books not be human?
Didn’t you talk about this in your post, with the child raised by wolves? Relationships with humans are vital for certain sorts of brain development, so an isolated child is much farther from the human norm.
but crucially we actually want future AGIs to have human values!
SIAI doesn’t, at least. When making an AI to self-improve to superintelligence, why make it so that it gets horny?
Exactly human values, not analogous to human values. So if humans value having sex but don’t value the FAI having sex, then the FAI will value humans having sex but not value having sex itself.
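If it helps, here is a toy sketch of that indirection (the outcome names and weights are entirely hypothetical): the FAI’s utility is defined over what happens to humans, according to human preferences, so giving the FAI “analogous” drives of its own never enters into it:

```python
# Toy sketch, all names and weights hypothetical: the FAI scores a world
# by how well *human* preferences about *human* lives are satisfied,
# rather than by installing analogous drives (e.g. its own sex drive).

HUMAN_PREFERENCES = {
    "humans_have_relationships": 1.0,
    "humans_have_sex": 1.0,
    "humans_flourish": 2.0,
}

def fai_utility(world: dict[str, bool]) -> float:
    """Sum the weights of the human-centred outcomes realised in `world`.
    Nothing here refers to the FAI's own experiences at all."""
    return sum(w for outcome, w in HUMAN_PREFERENCES.items() if world.get(outcome))

world = {"humans_have_sex": True, "humans_flourish": True, "fai_has_sex": True}
print(fai_utility(world))  # 3.0 -- "fai_has_sex" contributes nothing
```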