Well, consider this: it takes only a very small functional change to the human brain to make ‘raising it as a human child’ a questionable strategy at best. Crippling a few features of the brain produces sociopaths who, notably, cannot be reliably inculcated with our values, despite sharing 99.99…% of our neurological architecture.
Mind space is a tricky thing to pin down in a useful way, so let’s just say the bubble is really tiny. If the changes you’re making are larger than the changes between a sociopath and a neurotypical human, then you shouldn’t employ this strategy. Trying to use it on any kind of de novo AI without anything analogous to our neurons is foolhardy beyond belief. So much of our behavior is predicated on things that aren’t and can’t be learned, and trying to program all of those qualities and intuitions by hand so that the AI can be properly taught our value scheme looks broadly isomorphic to the FAI problem.