That argument may even be correct: a sufficiently advanced intelligence may see just how much less-interesting matter there is to exploit before the optimization question of "a tiny bit of resources to keep some ruins and the critters who made them around" vs. "another few percent of matter to make into computronium or whatever the super-AGI version of paperclips is" even arises.
And then that extends to preserving part of the solar system while exploiting other star systems.
I don't put a LOT of weight behind that argument. Not only is it pretty tenuous (we have no clue how many humans, or in what condition, the AI will decide are valuable enough to keep; note that we haven't kept very many ancient ruins), but it also ignores the ramp-up problem: the less-capable versions that are still trying to get smart and powerful enough to get off of Earth (and then out of the solar system) in the first place, when Earth is the only matter within reach.
I would agree with this. The easiest way to "encourage" an ASI to leave humans alone would be for humans to arm themselves with the most powerful weapons they can produce, helped by the strongest AI models humans can reliably control. This matter needs to fight back (see Geohot).