If you think (as I do) that a machine superintelligence singleton is largely inevitable (Bostrom forthcoming)
I can grant the “machine superintelligence” part as largely inevitable, but why “singleton”? Are you suggesting that Bostrom has a good argument for the inevitability of such a singleton, that he hasn’t written down anywhere except in his forthcoming book?
To some degree, yes (I’ve seen a recent draft). But my point goes through without this qualification, so I’ve edited my original comment to remove “singleton.”