I would tend to agree. I think the relationship between humanity and other species mirrors this: we have at least a desire to maintain as much diversity as we can. The risks to other species emerge from the side effects of our actions and from our own stupidity, which should not be the case with a superintelligence.
I guess NB is scanning a broader and meaner list of superintelligence scenarios.
I think it might drive toward killing those who have expensive wants and who do not occupy a special role in the network. Maybe a powerful individual who is extremely wasteful, and who is actively causing ecosystem collapse by breaking the network, would be killed to ensure the whole civilisation can survive.
I think the basic desire of a superintelligence would be identity and maintaining that identity. In this sense, “Postpone the Heat Death of the Universe”, or even reversing it, would definitely be its ultimate goal. Perhaps it would even want to become the universe.
(Sorry for the long delay in replying, I don’t get notifications.)