Robin outright dismisses the possibility of a singleton (AI, groupmind or political entity) farsighted enough to steer clear of Malthusian scenarios until the universe runs down. I tend to think this dismissal is mistaken, but I could be convinced that there is a rough trichotomy of human futures: extinction, singleton or burning the cosmic commons.
Of the three possibilities for the far future, the Malthusian scenario is the least bad. A singleton would be worse, and extinction worse yet. That doesn’t mean I favor a Malthusian result, just that the alternatives are worse.
I don’t agree that there are only three non-negligible possibilities, but putting that aside, why do you think the Malthusian scenario would be better than a singleton? (I believe even Robin thinks that a singleton, if benevolent, would be better than the Malthusian scenario.)
He says that a singleton is unlikely, but not negligibly so.
Ah, I see that you are right. Thanks.