Meanwhile, the intelligence of a single person, even a single genius, taken in isolation and allowed to acquire only limited resources, is actually not all that dangerous.
While I broadly agree with this sentiment, I disagree with this particular point.
I would consider even the creation of a single very smart human, with all human resourcefulness but completely alien values, to be a significant net loss to the world. If they represent 0.001% of the world’s aggregate productive capacity, I would expect this to make the world something like 0.001% worse (according to humane values) and 0.001% better (according to their alien values).
The situation is not quite so dire, if nothing else because of gains from trade (if our values aren’t in perfect tension) and the ability of the majority to stamp out the values of a minority if it is so inclined. But it’s in the right ballpark.
So while I would agree that broadly human capabilities are not a necessary condition for concern, I do consider them a sufficient condition for concern.