When some day some people (or some things) build an AGI [...] Humans will already have been obsoleted for all jobs except, probably, those that for emotional reasons require interaction with another human
To rephrase my question, how confident are you of this, and why? It seems to me quite possible that by the time someone builds an AGI, there are still plenty of human jobs that have not been taken over by specialized algorithms due to humans not being smart enough to have invented the necessary specialized algorithms yet. Do you have a reason to think this can’t be true?
ETA: My reply is a bit redundant given Nesov’s sibling comment. I didn’t see his when I posted mine.
I am far more confident in it than I am in the AGI-is-important argument. Which of course isn’t anywhere close to saying that I am highly confident in it. Just that the evidence for AGI-is-unimportant far outweighs that for AGI-is-important.