tried to make sure the universe kept on having conditions that would produce some things like humans.
I would submit that, from the point of view of the ancestor species we displaced, we (Homo sapiens) were the equivalent of UAI. We were a superior intelligence unconstrained by any set of values that supported our ancestor species. We tiled the planet with copies of ourselves, robbing our immediate ancestors in particular (who tended to occupy niches similar to ours) of resources and driving them extinct through our success.
So a universe with the conditions to produce things like humans: that is "the state of nature" from which UAI will arise and, if it is as capable as we fear, supplant humans as the dominant species.
I think this is the thinking behind all jokes along the lines of “I, for one, welcome our new robot overlords.”
That’s the goal. What, you want there to be humans a million years from now?
Is that true, or are you just being cleverly sarcastic? If that is the goal of CEV, could you point me to something written up on CEV where I might see this aspect of it?
I mean, that’s the goal of anyone with morals like mine, rather than just nepotism.