I think a very simple and general pessimistic take is “AI will make human thinking irrelevant”. It almost doesn’t matter if it happens by subversion or just normal economic activity. One way or another, we’ll end up with a world where human thinking is irrelevant, and nobody has described a good world like that.
The only good scenarios are ones where human thinking somehow avoids “habitat destruction”. Maybe through uplifting, explicit “habitat preservation”, or something else. But AI companies are currently doing the opposite, reallocating more and more tasks to AI outright, so it’s hard to be an optimist.