I often see arguments on LessWrong similar to this, and I feel compelled to disagree.
1) The AI you describe is God-like. But being able to do anything at a lower cost than its competitors does not by itself make trade pointless; trade is pointless only if the AI can do everything at such a low cost that doing it itself sacrifices nothing more important. Example: hiring humans to clean its server room is cheap for an AI that is busy creating Heaven, so it would have to be unbelievably efficient for that trade not to be attractive (see the toy numbers after this list).
2) If the AI is God-like, an extremely small amount of charity is required to dramatically increase humanity’s standard of living. Will the S team give at least 0.0000001% of their resources to charity? Probably.
3) If the AI is God-like, and if the S team is motivated only by self-interest, why would they waste their time dealing with humans? They will inhabit their own paradise, and the rest of us will continue working and trading with each other.
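To make the opportunity-cost point in (1) concrete, here is a minimal sketch with made-up numbers; every figure and variable name is purely illustrative, not something from the original argument. The point is just that trade stays attractive whenever the value of the AI's redirected effort exceeds the price of hiring humans, no matter how much better the AI is at the task.

```python
# Toy comparative-advantage calculation (all numbers are invented for illustration).
# Even if the AI can clean the server room far better than humans, paying humans
# wins whenever the value of the AI-time freed up exceeds the humans' price.

ai_cleaning_cost_hours = 0.001      # AI-hours needed to clean the room itself (assumed)
value_of_ai_hour = 1_000_000.0      # $ value of an AI-hour spent on its top priority (assumed)
human_cleaning_price = 200.0        # $ price of hiring humans for the same job (assumed)

# Opportunity cost of the AI doing the cleaning itself:
ai_opportunity_cost = ai_cleaning_cost_hours * value_of_ai_hour  # = $1,000 here

if human_cleaning_price < ai_opportunity_cost:
    print("Trade is attractive: paying humans $%.0f beats forgoing $%.0f of higher-value work."
          % (human_cleaning_price, ai_opportunity_cost))
else:
    print("Only in this regime, where the AI's opportunity cost is negligible, is trade pointless.")
```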
The economic problems associated with AI seem to be relatively minor, and it pains me to see smart people wasting their time on them. Let’s first make sure AI doesn’t paperclip our light cone—can we agree this is the dominant concern?
If they really don’t care about humans at all, then the AI will use every resource at its disposal to make its paradise as paradisaical as possible. Humans are made of atoms, and those atoms could be repurposed for computations that work out which paradise is best.
Though I find it unlikely that the S team would be that selfish; that’s a really tiny incentive to murder everyone.