From the perspective of the God of Evolution, we are the unfriendly AI:
We were supposed to be compelled to reproduce, but we figured out that we could get the reward by disabling our reproductive functions and continuing to go through the motions.
We were supposed to seek out nutritious food and eat it, but we figured out that we could concentrate the parts that trigger our reward centers and just eat that.
And of course, we’re unfriendly to everything else too:
Humans fight each other over farmland (= land that can be turned into food, which can be turned into humans) all the time.
We’re trying to tile the universe with human colonies and probes. It’s true that we’re not strictly trying to tile the universe with our DNA, but we are trying to turn it all into human things, and it’s not uncommon for people to be sad about the parts of the universe we can never reach and turn into humantronium.
We do not love or hate the cow, the chicken, or the pig, but they are made of meat that can be turned into reward-center triggers.
As to why we’re not exactly like a paperclip maximizer, I suspect one big piece is:
We’re not able to make direct copies of ourselves or extend our personal power to the extent that we expect an AI to be able to, so “being nice” is adaptive: there are a lot of things we can’t do alone. We expect that an AI could just make itself bigger, or make exact copies whose goals won’t diverge, so it wouldn’t need this.
DNA could make exact copies of itself, yet it “chooses” to mix itself with another set every generation. There might be similar pressures on minds: perfect copies all share the same blind spots, and shared blind spots invite exploitation.