This is nicely thought out and composed. It seems like you’re asking: what will skills be worth in the new equilibrium? That’s an intuitive question, but in this case it’s probably not a useful one to ask. The new equilibrium is that machines outcompete us dramatically for almost all jobs.
The only alternative is not believing that we’ll get actual general AI any time soon. If we do, it will handily outcompete humans at everything it’s allowed to do, and very quickly. So the very likely answer to how much we’ll make, for almost all trades, is: less than subsistence.
The relevant question is how quickly jobs become almost worthless. I could ask how long MY job will keep me fed, but it’s probably more productive to think beyond that.
The most relevant question is: what are we all going to do about it? Even if we solve technical alignment, we’re still faced with a world of vastly increased productivity and vastly decreased job opportunities. Capitalism as we know it isn’t going to work. A gentle reshuffling, à la universal basic income, probably won’t even cover it for the richest countries, let alone the world.
To keep everyone alive, let alone happy, we need radical new plans for the transition from capitalism to post-scarcity.
I’ve tried to listen to everyone thinking about the job loss issue. I haven’t found anyone smart and credible who’ll even claim to have a good guess about transition rates and whether we’ll get economic collapse. To me, it seems quite likely we will. Maybe it’s time to start thinking about how we’ll survive the transition even if we get the heaven scenario from AI in the longer run.
What fraction of job loss can the economy sustain over a short time? If we lose 5% of jobs globally in two years, we might be fine. The actual numbers are probably much higher. I’m no economist, but it seems like we’re heading for a massive recession during the transition to an almost-no-work-for-humans “economy”.
Yeah. The way the world works now, if technical alignment work is successful, it will just lead to AIs that are aligned to making money or winning wars. There need to be AIs aligned to human flourishing, but nobody wants to spend money training those. OpenAI was started for this purpose, but got taken over by money interests.
An AI aligned to making money is not that much better than one aligned to making paperclips.