Wei, when you’re trying to create intelligence, you’re not trying to get it human, you’re trying to get it rational.
When it comes to morality—well, my morality doesn’t talk about things being right in virtue of my brain thinking them, but it so happens that my morality is only physically written down in my brain and nowhere else in the physical universe. Likewise with all other humans.
So to get a powerful moral intelligence, you’ve got to create intelligence in the first place using an implementation-independent understanding, and then direct that intelligence to acquire certain information from physical human brains (because that information doesn’t exist anywhere else), whether that information has to be read out by directly scanning a brain via nondestructive nanotech, or can be confidently inferred just from examining the causal shadows of brains (like their written words).