The personal consequences are there. They’re staring you in the face with every job in translation, customer service, design, transportation, or logistics that gets automated in such a way that there is no value you can possibly add to it.
...
2–3 years ago I was on track to becoming a pretty good illustrator, and that would have been a career I would have loved to pursue. When I saw the progress AI was making in that area (and I was honest with myself about this quite a bit earlier than other people, who are still going through the bargaining stage now), I was disoriented and terrified in a way quite different from the ‘game’ of worrying about some abstract, far-away threat.
This is the thing to worry about. There are real negative consequences to machine learning today, sitting inside the real negative consequences of software’s dominance, and we can’t stop the flat fact that a life of work is going away for most people. The death cult vibe is the wild leap. It does not follow that AI is going to magically gain the power to gain the power to gain the power to kill humanity faster than we can stop disasters.
There are specific technical arguments about why AI might rapidly kill everyone. You can’t figure out if those arguments are true or false by analysing the “death cult vibes”.
Now you can take the position that death cult vibes are unhealthy and not particularly helpful. Personally, I haven’t actually seen a lot of death cult vibes. I have seen more “fun mental toy from philosophy land” vibes, where total doom is discussed as if it were a pure maths problem. But if there are death cult vibes somewhere I haven’t seen, those probably don’t help much.
This has Arrested Development energy ^_^ https://pbs.twimg.com/media/FUHfiS7X0AAe-XD.jpg