I’d also say that humans are very misaligned with entities much less powerful than themselves, like slaves, animals, women, blacks, and more, and misalignment is the norm in history, not alignment.
I don’t think this is a particularly good argument in the case of humans, because a lot of the reasons for such domination in that special case have to do with a terminal value for it, not because it actually works to the instrumental benefit of the subjugators. There are plenty of economists who will tell you that America is better off for white males not having slaves and letting women get jobs. I also personally dislike using this kind of example-giving to normies, because they then accuse me of anthropomorphizing. Better to look at what an AI system values, what it can do, and just say “hm, the AI values this nonhuman state of affairs more than the human state and oh look, it can make that happen”.
True enough, and I’d agree here that I might be anthropomorphizing too much.
So the animal and slave examples (like factory farming or, plausibly, hunting/habitat destruction) are useful cases of instrumental convergence, where getting healthy diets and making money are the instrumental values that result in catastrophe for animals and slaves.
Also, slavery was profitable, at least in my opinion, so much so that it effectively funded the majority of America’s wealth thanks to the cotton gin, which allowed massive wealth to be extracted from slaves.
Here’s a link: https://faculty.weber.edu/kmackay/economics of slavery.asp
Another link, albeit more polemical than the last one: https://www.vox.com/identities/2019/8/16/20806069/slavery-economy-capitalism-violence-cotton-edward-baptist