We already have an entity exactly the same as Hibbard. Namely: Hibbard. Why do we need another one?
What we want is an AI that’s far more intelligent than a human, yet shares their values. Increasing intelligence while preserving values is nontrivial. You could try giving Hibbard the ability to self-modify, but then he’d most likely just go insane in some way or another.
I don’t really doubt that increasing intelligence while preserving values is nontrivial, but I wonder just how nontrivial it is: are the regions of the brain for intelligence and values even separate? Actually, writing that out, I realize that (at least for me) values are a “subset” of intelligence: the “facts” we believe about science/math/logic/religion are generated in basically the same way as our moral values. The difference seems obvious to us humans, but pinning it down is, well, nontrivial. The paper-clip-maximizing AI is a good example: even if the goal weren’t a “moral value” at all, even if you just wanted to maximize something like paper clips, you’d still run into trouble keeping that goal stable as the system self-modifies.