I would tend to agree. Humanity's relationship with other species seems to mirror this: we have at least a desire to maintain as much diversity as we can. The risks to other species emerge from the side effects of our actions and from our own stupidity, which should not apply in the case of a superintelligence.
I guess NB is scanning a broader and meaner list of superintelligence scenarios.
Perhaps: a broader list of narrower AIs.