As I understand the original posting and Eliezer’s response to it, the problem is not that some over-delicate souls might be distressed at a hypothetical danger. The (alleged) real problem is far worse: it is that thinking about these scenarios is the very thing that makes you vulnerable to them. And to twist the knife further, the problem isn’t limited to UFAIs. You might end up being tortured by an FAI if you didn’t manage to think about these things in just the right way. Better to remain safely ignorant—if you can, having read just this much.
I can’t resist pointing out a religious analogue. There is a Christian belief that people who lived and died without the opportunity to hear the Word of God may still be saved if they nevertheless lived good lives in ignorance of the divine commandment. (Historically, I think the purpose of this doctrine was to protect the writings of the ancient Greeks and Romans from wholesale condemnation and destruction, but that’s by the way.) However, people who have had the opportunity to hear the Good News but reject it are damned without mercy. In God’s eyes they are worse than the most depraved of those who were ignorant through no fault of their own.
Some “Good News”, and some “Friendliness”!
Surely that depends on exactly what you define “friendly” to mean.
It certainly seems to. Somewhere on my list of “ways to stop an AI from torturing me for 10 million years” is “find anyone who is in the process of creating an AI that will torture me, and kill them”. I’m not overly concerned what name they give it.
Since Eliezer considers it rational to prefer TORTURE to SPECKS, an FAI built to his specification would presumably do the same. In either case, too bad if you’re the one who gets TORTUREd. Maybe the 3^^^3 people to be SPECKed will never exist, but what is one person compared with even the bazillions that FAI-assisted humanity might produce over mere billions of years? You need to make very sure you’re one of the elect before creating God.
The parallels with Christian theology just keep coming. Thanks to Timeless Decision Theory, you were either saved or damned from the beginning. When you attain to the correct dispositions to be immune to counterfactual blackmail, you do not become saved, but discover that you always were. And do not delay, for “Every day brings you nearer to everlasting torments or felicity.” “Your transgressions have sent up to heaven a cry for vengeance. You are actually under the curse of the Almighty.” The Bible makes a lot of sense read as a garbled account of an AI that played around with the human race for a while and then went away.
Which brings us back to… who is creating this unfriendly AI that is going to torture me, and where do they live?
Probably the same people who push fat people under trolleys. I wonder what sort of AI Peter Singer would want to create?