To expand: the reason this thesis still matters is that I don’t believe the best-case scenario is likely, or compatible with how things currently are. Accidentally creating ASI is almost guaranteed to happen at some point. As such, the biggest points of investment should be:
- Surviving the transitional period
- Establishing mechanisms for negotiation in an equilibrium state
>It’s harder to get those (starting from Earth) than things on Earth, though.
It’s not that much harder, and we can make it harder to extract Earth’s resources (or easier to extract non-Earth resources).
>Satisfying higher-level values has historically required us to do vast amounts of farming and strip-mining and other resource extraction.
This is true. However, there are also many organisms that are resilient even to our most brutal forms of farming. We should aim for that level of adaptability ourselves.
>It is barely “competition” for an ASI to take human resources. This does not seem plausible for bulk mass-energy.
This is true, but energy is only really scarce to humans, and even then our mass-energy requirements are laughably small compared to the mass-energy in the rest of the cosmos. Earth is only about 0.0003% of the total mass in the solar system, so we only need to be marginally harder to disassemble than the rest of that mass-energy to buy time.
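As a rough sanity check on that figure, here is a back-of-the-envelope sketch using approximate published masses (the exact values for the smaller bodies barely matter, since the Sun dominates the total):

```python
# Back-of-the-envelope check: Earth's share of the solar system's mass.
# Approximate values in kilograms; the Sun accounts for ~99.86% of the total.
SUN_MASS = 1.989e30          # kg
PLANETS_AND_REST = 2.7e27    # kg, all planets, moons, and small bodies combined (rough)
EARTH_MASS = 5.972e24        # kg

total_mass = SUN_MASS + PLANETS_AND_REST
earth_share = EARTH_MASS / total_mass

print(f"Earth's share of solar-system mass: {earth_share:.4%}")
# Prints roughly 0.0003%, i.e. disassembling Earth gains an ASI almost
# nothing relative to the mass-energy available elsewhere in the system.
```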
>Right, but we still need lots of things the ASI also probably wants.
This is true, and it is especially true in the early stages, when ASI technology is roughly comparable to our own. As its technology advances, though, it may come to want inherently different things that we can’t currently comprehend.