Andrix, if it is just a recoiling from that, then how do you explain Stalin, Mao, etc?
Yes, Nancy, as soon as an AI endorsed by Eliezer or me transcends to superintelligence, it will probably make a point of preventing any other AI from transcending, and there is indeed a chance that that will entail killing a few (probably very irresponsible) humans. It is very unlikely to entail the killing of millions, and I can go into that more if you want.
The points are that (1) self-preservation and staying in power are easy if you are the only superintelligence in the solar system, and that (2) unlike a governing coalition of humans who believe the end justifies the means, a well-designed, well-implemented superintelligence will not kill or oppress millions for a nominally prosocial end that is in reality a flimsy excuse for staying in power.
there is indeed a chance that that will entail killing a few (probably very irresponsible) humans
I disagree. Killing people to stop them from doing bad stuff is only necessary given insufficient resources to prevent them from doing the bad stuff in a nicer way. If the FAI makes the tradeoff that expending those resources isn't worth it, then it doesn't sound very friendly to me.