It seems like humans need an outgroup.
My evidence is not super strong, but I notice a few things:
There’s less political tension and infighting when there’s a clear enemy. Think about wartime.
There’s a whole political theory about creating ingroup cohesion by defining the ingroup against an outgroup. This is how a number of nation-states and religions coalesced.
Political infighting has ramped up a lot over the last 30+ years. This has also been a long stretch of peace with no threat of major-power war. Theory: people constructed an outgroup.
My theory is roughly this: humans need an ingroup for a variety of reasons not detailed here; there’s no ingroup without an outgroup; thus they need an outgroup to define the ingroup. If no natural outgroup exists, they’ll create one.
This is the basic intuition behind the “war on X” framing of political topics. Casting Drugs, or Cancer, or whatever as the “outgroup” triggers that sense of us-vs-them. But it doesn’t work that well, because human brains are more complicated than that, and are highly tuned to the mix of competition and cooperation with other humans, not with non-agentic things.
One of the first things people do when conceiving of members of an outgroup is to forget or deny their humanity. This step fails for things that were never human to begin with, and I suspect that failure derails this path to cohesion.
Humans are so fucked up.
“We need an enemy that we can believe is inhuman, so we can unite to fight it.”
“Okay, what about Death? That’s a logical choice considering that it is already trying to kill you...”
“Nah, too inhuman.”
War framing leads to centralization of power. It allows those at the top to weaken their political enemies, which in turn results in fewer open conflicts. This has advantages but also comes with its own problems, as dissenting perspectives on how to address the problem get pushed out.
This is why I strongly believe a Hollywood-style alien or Terminator-AI attack would do incredible things for uniting humanity. Unfortunately, AGI irl is unlikely to present itself in a way that makes it easy to outgroup…