I have always failed to see what the problem is, and this confuses me greatly. To me it seems obvious that any sane entity would be dominated by considerations like this.
In this particular situation, though, the expected gain from instead donating the money to SIAI, thereby increasing the probability of a friendly singularity that could hack out of the matrix and use the computing power to create 4^^^^4 units of fun, far outweighs it. And since the size of the threat correlates with the size of this reward, the reward will always scale up so that this remains true.