In my rambling, I intended to address some of these issues but chose to cap it off at a point I found satisfying.
The first point: simply put, I do not see the labor an AGI would need to bring its capabilities to full potential requiring any more than 10% of the labor force. Admittedly, this is an arbitrary number with no firm basis in reality.
On the second point, I do not believe we need to see even more than 30% unemployment before severe societal pressure is put on tech companies and the government to do something. This isn’t quite as arbitrary: unemployment rates as “low” as 15% have been triggers for severe social unrest.
As it stands, roughly 60% of the American economy is wrapped up in professional work: https://www.dpeaflcio.org/factsheets/the-professional-and-technical-workforce-by-the-numbers
Assuming only half of that is automated within five years (a good bit of it still requires physical robots), you have already caused enough pain to get the government involved.
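The back-of-envelope arithmetic behind that estimate can be made explicit. A minimal sketch, where both input shares are this essay’s own rough assumptions rather than measured data:

```python
# Back-of-envelope sketch of the unemployment estimate above.
# Both inputs are rough assumptions from the surrounding argument,
# not measured data.
professional_share = 0.60  # share of the US economy in professional work (per the cited factsheet)
automatable_half = 0.50    # assumed fraction of that work automatable without physical robots

implied_unemployment = professional_share * automatable_half
print(f"Implied unemployment from cognitive automation alone: {implied_unemployment:.0%}")
# Roughly 30% of the workforce, i.e. already at the unrest threshold argued above.
```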
However, I do predict that there will be SOME material capability in the physical world. My point is more that the potential for a rebellion to be crushed through robotics capabilities alone will not be there, as most robotic capacity will be deployed for labor.
I suppose the point there is that there will be a “superchilled” window of robotics capability arriving at around the same time AGI is likely to (the latter half of the 2020s): robots advanced enough to do labor, and deployed at a large enough scale to do so, but not so overwhelmingly that literally every possible physical job is automated. Hence I kept the estimates down to around 50% unemployment at most, though possibly as high as 70% if companies aggressively try to future-proof themselves for whatever reason.
Furthermore, I’m going off the news that companies are already beginning to use generative AI to automate their workforces (mostly automating individual tasks at this point, though this will inevitably generalize to whole positions), despite the technology not yet being fully mature for deployment (e.g. ChatGPT, Stable Diffusion/Midjourney, etc.): https://finance.yahoo.com/news/companies-already-replacing-workers-chatgpt-140000856.html
If it’s feasible for companies to save money via automation, they will take the opportunity. Likewise, I expect plenty of businesses to automate ahead of time in the near future on the strength of AI hype alone.
The third point is one I had intended to address more directly: the prospect of losing material comfort and stability is in fact a suitable emotional and psychological shock to drive unrest and, given enough uncertainty, a revolution. We saw this as recently as the COVID lockdowns in 2020 and the protests that arose following that March (for various reasons). We have seen reactions to job loss be similarly violent at earlier points in history. Some of this was buffered by the prevalence of unions, but we have since deunionized en masse.
It should also be stressed that we in the West have never had to deal with the prospect of such intense, permanent unemployment. In America and the UK, the last time the numbers were anywhere near 30% was during the Great Depression, and few people in those times expected them to remain so high indefinitely. In our current situation, by contrast, we’re not just expecting 30% to be the ceiling; we’re expecting it to be the floor, and for unemployment to eventually reach 100% (or at least 99.99%).
I feel most people wouldn’t mind losing their jobs if they were paid for it. I feel most people wouldn’t mind comfortable stability through robot-created abundance. I merely present a theory that all of this change, coming too fast to handle, before we’re properly equipped for it, in a culture that in no way values or prepares us for the lifestyle being promised, is going to end very badly.
There are any number of other things which might already have caused a society-wide luddite revolt—nuclear weapons, climate change, Internet surveillance—but it hasn’t happened.
The fundamental issue is that none of these has had a direct negative impact on the financial, emotional, and physical wellbeing of hundreds of millions of people all at once. Internet surveillance is the closest, but even then it’s a somewhat abstract privacy concern; climate change eventually will, but not soon enough for most people to care. This scenario, however, would be actively, tangibly happening, and at accelerando speeds. I’d also go so far as to say these issues have merely built up like a supervolcanic caldera over the decades: many people do care about them, but there has never been a major trigger to actually protest en masse as part of a Luddite revolt over them.
The situation I’m referring to is entirely the long-idealized “mass unemployment from automation,” and current trends suggest this is going to happen very quickly rather than over longer timeframes. If there has ever been a reason for a revolt, taking away people’s ability to earn income and put food on the table is it.
I expect there will be a token effort to feed people to prevent revolt, but the collision between the expectation that things are not going to change and the prospect of wild, uncontrollable change will be the final trigger. The promise that “robots are coming to give you abundance” is going to go down badly. It will inevitably become a major culture war topic, and one I don’t think enough people will believe even in the face of actual AI and robotic deployment. And that’s before the psychosocial response: millions upon millions who would feel horribly betrayed at the sight of their expected future going up in smoke, their incomes vastly reduced, and the prospect of death looming (whether by super-virus, disassembly, or mind-uploading, the last of which is indistinguishable from death to the layman). And good lord, that’s not even bringing up cultural expectations, religious beliefs, and entrenched collective dogma.
The only possible way to avoid this is to time it perfectly. Don’t automate much right up until AGI’s unveiling. Then, while people are still reeling from the shock, automate as much as possible and deploy machines to increase abundance.
Of course, the AGI likely kills everyone instead; but if it works, you might stave off a Luddite rebellion, provided there is enough abundance to satisfy material comforts. This is an almost absurd trick shot, though, one that requires capitalists to stop acting like capitalists for several years and then to discard capitalism entirely afterwards.