The truly interesting thing here is that I would agree unequivocally with you if you were talking about any other kind of ‘cult of the apocalypse’.
These cults don’t have to be based on religious belief in the old-fashioned sense; in fact, most cults of this kind that really took off in the 20th and 21st centuries are secular.
Since around the late 1800s, there has been a certain type of student who externalizes their (mostly his) unbearable pain and dread, their lack of perspective and meaning in life, into ‘the system’, and throws themselves into the noble cause of fighting capitalism.
Perhaps one or two decades ago, there was a certain kind of teenager who got absorbed in online discussions about science vs. religion, 9/11, big pharma, the war economy—in this case I can speak from my own experience and say that for me this definitely was a means of externalizing my pain.
Today, at least in my country, climate change has saturated this memetic-ecological niche for a lot of teenagers.
In each of these cases, I see the dynamic as purely pathological.
But. And I know what you’re thinking. But still, but. In the case of technological progress and its consequences for humanity, the problem isn’t abstract in the way these other problems are.
The personal consequences are there.
They’re staring you in the face with every job in translation, customer service, design, transportation, and logistics that gets automated in such a way that there is no value you can possibly add to it.
They’re on the horizon, with all the painfully personal problems that are coming our way in 10-20 years.
I’m not talking about the apocalypse here; I don’t mind whatshisface’s Basilisk or utility maximizers turning us all into paperclips—these are cute intellectual problems and there might be something to them, but ultimately if the world ends that’s no one’s problem.
2-3 years ago I was on track to become a pretty good illustrator, and that would have been a career I would have loved to pursue. When I saw the progress AI was making in that area—and I was honest with myself about this quite a bit earlier than other people, who are still going through the bargaining stages now—I was disoriented and terrified in a way quite different from the ‘game’ of worrying about some abstract, far-away threat. And I couldn’t get out of that mode until I was able to come up with a strategy, at least for myself.
If this problem gets to the point where there just isn’t a strategy I can take to avoid having to acknowledge my own irrelevance—because we’ve invented machines that are, somehow, better at all the things we find value in and value ourselves for than the vast majority of us can possibly hope to be—I think I’ll be able to make my peace with that, but that’s because I understand the problem well enough to know what a terminal diagnosis will look like.
Unlike war, poverty, and other injustices, humans replacing themselves is a true civilization-level existential problem: not in the sense that it threatens our subsistence, but in the sense that it threatens the very way we conceive of ourselves.
Once you acknowledge that, then yes.
I agree with your core point.
It’s time to walk away. There’s nothing you can do about technological progress, and the world will not become a better place for your obsessing over it.
But you still need to know that your career as a translator or programmer or illustrator won’t be around long enough for it to amount to a life plan. You need to understand how the reality of the problem will affect you, so that you can go on living while doing what you need to do to stay away from it.
Like not building a house somewhere that you expect will be flooded in 30 years.
The personal consequences are there. They’re staring you in the face with every job in translation, customer service, design, transportation, and logistics that gets automated in such a way that there is no value you can possibly add to it.
...
2-3 years ago I was on track to become a pretty good illustrator, and that would have been a career I would have loved to pursue. When I saw the progress AI was making in that area—and I was honest with myself about this quite a bit earlier than other people, who are still going through the bargaining stages now—I was disoriented and terrified in a way quite different from the ‘game’ of worrying about some abstract, far-away threat.
This is the thing to worry about. There are real negative consequences to machine learning today, sitting inside the real negative consequences of software’s dominance, and we can’t stop the flat fact that a life of work is going away for most people. The death cult vibe is the wild leap. It does not follow that AI is going to magically gain the power to gain the power to gain the power to kill humanity faster than we can stop disasters.
There are specific technical arguments about why AI might rapidly kill everyone. You can’t figure out if those arguments are true or false by analysing the “death cult vibes”.
Now you can take the position that death cult vibes are unhealthy and not particularly helpful. Personally I haven’t actually seen a lot of death cult vibes. I have seen more “fun mental toy from philosophy land” vibes. Where total doom is discussed as if it were a pure maths problem. But if there are death cult vibes somewhere I haven’t seen, those probably don’t help much.
but ultimately if the world ends that’s no one’s problem.
This is an interesting claim. If I had a planet-destroying weapon that would leave the ISS astronauts alive, would you say “don’t worry about it much, it’s only 3 astronauts’ problem”?
I wasn’t trying to speak to this part. But now that you have, I’m glad you did. I don’t mean to dismiss the very real impacts that this tech is having on people’s lives.
That’s just a different thing than what I was talking about. Not totally unrelated, but a fair bit off to the side.
This has Arrested Development energy ^_^ https://pbs.twimg.com/media/FUHfiS7X0AAe-XD.jpg
I agree.