there is not a chance in hell that 1.5 million people would voluntarily build a robot army for a tyrant
There are so many ways that a tyrant could end up with a robot army. Let’s not pretend that’s the only way. Here are a few:
A country is in turmoil and a leader comes along who makes people feel hope. The people are open to “lesser evil” propositions and risk-taking because they are desperate. They make irrational decisions and empower the wrong person. Hitler is a real-life example of this happening.
A leader who is thought of as “good” builds a killer robot army. Then the realization that they have total power over their people corrupts them, and they begin behaving like a tyrant, effectively turning into an oppressive dictator.
Hypothetical scenario: The setting is a country with presidential elections (I choose America for this one). Hypothetically, we’ll say the technology to do this was completely ready to be exploited, so the government begins to build a killer robot army. A good president happens to be in office, so people think it’s okay. We’ll say that president gets a second term. Eight years pass, and a significant killer robot army is created, powerful enough to kill every American. Now it’s time to change the president. Maybe the American people choose somebody with their best interests in mind. Maybe they choose a wolf in sheep’s clothing, or a moron who doesn’t understand the dangers. It’s not like we haven’t elected morons before, and it isn’t as if entire countries full of people have never empowered anyone dangerous. I think it’s reasonable to say that there’s at least a 5% chance that each election will yield either a fatally moronic person, an otherwise good person who is susceptible to being seriously corrupted if given too much power, someone with a tyrant’s values and personality, or a sociopath. If you’re thinking to yourself “how many times in American history have we seen a president corrupted by power,” consider that there have been checks and balances in place to prevent them from having enough power to be corrupted by. In my opinion, it’s likely that most of them would be corrupted by the kind of absolute power that a killer robot army would give them, and 5% is actually quite a low estimate compared with my model of how reality works. But for hypothetical purposes, we’ll pretend it’s only as high as 5%. We roll the dice on that 5% chance every four years, because we hold elections again. Those chances compound: over the rest of my life, say twenty more elections, the probability that at least one of them goes wrong is 1 − 0.95^20 ≈ 64%, making it more likely than not that the wrong person will end up having total control over the country I live in.
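The compounding in that last step can be checked with a couple of lines of Python; the 5%-per-election figure and the twenty-election horizon are the assumptions from the paragraph above, not established facts:

```python
# Chance that at least one election "goes wrong", assuming (as above)
# a 5% chance per election and twenty more elections (~80 years).
p_bad_per_election = 0.05
elections = 20

# Probability that NO election goes wrong is 0.95^20; complement it.
p_at_least_one = 1 - (1 - p_bad_per_election) ** elections
print(round(p_at_least_one, 2))  # -> 0.64
```

Note that simply adding the 5% chances would overshoot once the number of elections grows; compounding is the correct way to combine independent per-election risks.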
Do you see now how a tyrant or other undesirable leader could very conceivably end up heading a killer robot army?
[1]: 1⁄200 of the US population, each paid an average Microsoft salary, comes to roughly 150 billion USD. This would require many, many years of work; given how long the military has worked on Predators, probably decades. So it would require trillions of dollars.
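The quoted figure can be sketched in a few lines; the ~300 million population and the $100,000 average salary are my assumptions, chosen to reproduce the $150 billion number:

```python
# Back-of-the-envelope check of the quoted figure.
# Assumptions (mine): US population ~300 million,
# average Microsoft-level salary ~$100,000/year.
us_population = 300_000_000
workforce = us_population // 200   # 1/200 of the population: 1,500,000 people
salary = 100_000                   # assumed average annual salary, USD

annual_cost = workforce * salary
print(annual_cost)       # -> 150000000000, i.e. $150 billion per year

# Sustained over decades of development, this reaches trillions:
print(annual_cost * 20)  # -> 3000000000000, i.e. $3 trillion over 20 years
```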
Thank you. I am very glad for these figures. How long do you think it would take for the US government to build 100 million killer robots?
Also, I don’t think you understand sociopathy. The 1⁄20 figure you cited should be 1⁄25, which refers to the DSM’s “antisocial personality disorder”;
Not sure why we have different numbers, but: the statistics for America differ from those for other countries (so the figure varies depending on whether your source is aiming for a global or local number), the statistic probably changes over time, the DSM changes over time, and multiple sources on this probably disagree; the 1⁄20 figure is based on research I did ten years ago, so something in there probably explains it. A one-percentage-point difference in prevalence is irrelevant here, since (in my extremely amateurish, shoot-from-the-hip estimate, “just to get some perspective”) 1 in 200 people willing to work on the robot army is enough, and 1⁄25 is obviously significantly larger than that.
sociopathy is a deficit in moral reasoning, which is very different from being a person who’s just waiting to become a minion to some dictator.
Ah, but if you want to be a tyrant, you don’t need minions who have been dreaming of becoming minions. Consider this: most people who are employed at a job didn’t dream of becoming what they are. There are a lot of people working as cashiers, grocery baggers, doing boring work in a factory, working in telemarketing, etc., who dislike and even despise their jobs. Why do they do them? Money, obviously.
Why don’t those people turn into drug dealers? They’d make more money that way.
Ethics!
Those people have a sense of right and wrong, or at least are successfully coerced by laws.
People with antisocial personality disorder, the way the DSM defines it, have neither of these properties.
You said yourself above that most people wouldn’t want to build a robot army for a tyrant. I agree. But a sociopath doesn’t give a rat’s behind how they get their money. That is why they are more likely to work for a tyrant—they don’t have a conscience and they don’t care about the law. If they can make more money assembling killer robots than flipping burgers, there’s nothing to stop them from taking the killer robot job.
Taking this into consideration, do you think sociopaths could end up building killer robots?
It seems to me like you’re outlining four different scenarios:
1) The United States, or another major power, converts from manned to unmanned weapons of war. A military coup is impossible today because soldiers won’t be willing to launch one; were soldiers to be replaced by robots, the robots could be ordered to launch one.
2) Another state develops unmanned weapons systems which enable it to defeat the United States.
3) A private individual develops unmanned weapons systems which enable them to defeat the United States.
4) Another state which is already a dictatorship develops unmanned weapons systems which allow the dictator to remain in power.
My interpretation of your original comment was that you were arguing for #3; that is the only context in which hiring sociopaths would be relevant, as normal weapons development clearly doesn’t require hiring a swarm of sociopathic engineers. The claim that dictatorships exclusively or primarily rely on sociopaths is factually wrong: according to data from Reserve Police Battalion 101, for example, 97% of an arbitrary sample of Germans under Hitler were willing to take guns and mow down civilians. Certainly, close to 100% of an arbitrary sample of people would be willing to work on developing robots for either the US or any other state; we can easily see this today.
If you were arguing for #2, then my response would be that the presence of unmanned weapons systems wouldn’t make a difference one way or another: if we’re positing another state able to outdevelop, then defeat, the US, it would presumably be able to do so anyway. The only difference would be if it had an enormous GDP but a low population; but such a state would be unlikely to be an aggressive military dictatorship, and anyway, one clearly doesn’t exist.
For #4, current dictatorships are too far behind in terms of technological development for unmanned weapons systems to have a significant impact. What we see today is that the most complex weapons systems are produced in a few, mostly stable and democratic nations, and there’s good reason to think that democracy is caused by economic and technological development, such that the states most able to produce unmanned weapons are also the most likely to already be democratic. (More precisely, are the most likely to be democratic by the time they build enough unmanned weapons, which seems to be decades off at a minimum.) Worst case, there are 1–3 states (Iran, Russia, China[*]) likely to achieve the capacity to build their own unmanned weapons systems without being democracies; and even then, it’s questionable whether unmanned weapons systems would be able to do all that much. (It depends on the exact implementation, of course; but in general, no robots can assure the safety of a dictator, and they cannot stop the only way recent Great Power dictatorships have loosened up: by choice of their leaders.)
[*] This is a list of every country that is a dictatorship or quasi-dictatorship that’s built its own fighters, bombers, or tanks, minus Pakistan. I’m very confident that China’s government already has enough stability/legitimacy concerns and movement towards democracy that they would implement safeguards. Iran and Russia I give ~50% chances each of doing so.
If you were arguing for #1, then a) the US has well-established procedures for oversight of dangerous weapons (i.e. WMD) which have never failed, b) it would be much easier for the President and a small cabal to gain control using nukes than robots, c) the President doesn’t actually have direct control of the military, and the ability to create military plans and respond to circumstances for large groups of military robots almost certainly requires AGI, d) as noted separately, there will never be a point, pre-AGI, where robots are actually independent of people, and e) conspiracies as a general rule are very rarely workable, and this hypothetical conspiracy seems even less workable than most, because it requires many people working together over at least several decades, each person ascending to an elite position.
How long do you think it would take for the US government to build 100 million killer robots?
I don’t believe that it ever will. If the US spends $6,000 in maintenance per robot, that would eat up the entire US military budget, and $6,000 is almost certainly a severe underestimate of the cost of operating them, by roughly 3 orders of magnitude. And anyway, that neglects a huge number of relevant factors: the cost of purchasing an MQ-1, amortized over a 30-year operating period, is roughly the same per year as the operating cost; money also needs to be spent on R&D; non-robot fixed costs total at least 6% of the US military budget; much of US military spending is on things like carriers or aerial refueling or transports whose robot equivalents wouldn’t be ‘killer robots’; etc. (The military budget may go up over time, but the cost per plane has risen faster than the military budget since WWII, so if anything this also argues against large numbers of robots.)
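The budget claim is simple arithmetic; here is a quick sketch, where the ~$600 billion annual US military budget is my own ballpark assumption rather than a figure from the comment:

```python
# Sanity check: at $6,000 maintenance per robot, 100 million robots
# cost $600 billion, roughly the whole US military budget.
# (The ~$600B/year budget figure is an assumption, a contemporary ballpark.)
robots = 100_000_000
maintenance_per_robot = 6_000       # USD, the deliberately low figure above
military_budget = 600_000_000_000   # assumed annual US military budget, USD

total = robots * maintenance_per_robot
print(total)                     # -> 600000000000
print(total >= military_budget)  # -> True
```

If the true operating cost is ~3 orders of magnitude higher, as claimed, the total would be ~$600 trillion, vastly exceeding any plausible budget.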
An alternative date: I would expect the USAF to be majority unmanned by 2040 ± 5 years (50% bounds, most uncertainty above); this is roughly one lifecycle of planes forward from today. (Technically it’s a fair bit less; but I’d expect development to speed up somewhat.)
I would expect the US Army to deploy unmanned ground combat units in serious numbers by 2035 ± 5 years.
I would expect the USAF to remove humans from the decision-making loop on an individual plane’s flights in 2045 ± 5 years; on a squadron, including maintenance, command, etc., 2065 ± 5 years; above that, never.
How long do you think it would take for the US government to build 100 million killer robots?
I don’t believe that it ever will.
Technologies become less expensive over time, and as we progress, our wealth grows. If we don’t have the money to produce them at the current cost, that doesn’t mean we’ll never be able to afford to.
If the US spends $6,000 in maintenance per robot, that would eat up the entire US military budget. $6,000 is almost certainly a severe underestimate of the cost of operating them, by roughly 3 orders of magnitude
You didn’t specify a time period—should I assume that’s yearly? Also, do they have to pay $6,000 in maintenance costs while the units are in storage?
and anyway, that neglects a huge number of relevant factors: the cost of purchasing an MQ-1, amortized over a 30-year operating period, is roughly the same per year as the operating cost; money also needs to be spent on R&D; non-robot fixed costs total at least 6% of the US military budget; much of US military spending is on things like carriers or aerial refueling or transports whose robot equivalents wouldn’t be ‘killer robots’; etc. (The military budget may go up over time, but the cost per plane has risen faster than the military budget since WWII, so if anything this also argues against large numbers of robots.)
Okay, so an MQ-1 is really, really expensive. Thank you.
An alternative date: I would expect the USAF to be majority unmanned by 2040 ± 5 years (50% bounds, most uncertainty above); this is roughly one lifecycle of planes forward from today. (Technically it’s a fair bit less; but I’d expect development to speed up somewhat.) I would expect the US Army to deploy unmanned ground combat units in serious numbers by 2035 ± 5 years.
What is “serious numbers”?
I would expect the USAF to remove humans from the decision-making loop on an individual plane’s flights in 2045 ± 5 years; on a squadron, including maintenance, command, etc., 2065 ± 5 years; above that, never.
What do you mean by “above that, never”?
Sorry I didn’t get to your other points today. I don’t have enough time.
P.S. How did you get these estimates for when unmanned weapons will come out?